<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://www.sternwarte.uni-erlangen.de/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Gokus</id>
	<title>Remeis-Wiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://www.sternwarte.uni-erlangen.de/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Gokus"/>
	<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php/Special:Contributions/Gokus"/>
	<updated>2026-04-08T18:29:01Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.35.7</generator>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=3633</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=3633"/>
		<updated>2025-03-17T19:24:03Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the scripts, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, extensive documentation is available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, please find the information on the Fermi contact person at the Remeis observatory at the bottom of the page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a Python package called ''fermipy'', which allows the data analysis to be done from within a Python environment. &lt;br /&gt;
At Remeis, analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. These can be found at $FERMITOOLS, which links to /data/software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options: using an already existing conda environment, or creating the corresponding conda environment yourself. The first option is the recommended choice.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 (recommended) - Using the existing conda environment (Fermitools 2.2.0 &amp;amp; fermipy version 1.3.1) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the versions in this conda environment. To invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda3/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy3 &amp;quot;conda activate fermipy_3&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing this conda installation by default!'''&lt;br /&gt;
In case you're using a bash shell, the addition in your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda3/etc/profile.d/conda.sh&lt;br /&gt;
alias fermipy3='conda activate /userdata/data/gokus/conda/miniconda3/envs/fermipy_3'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy3''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub; the full code, documentation, and installation instructions can be found at:&amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you install the right version of each (between versions there was a switch from Python 2.7 to Python 3). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
=== If you want to use the old conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
In an earlier version, the analysis pipeline ran on Python 2.7 and the most recent versions of the Fermitools and fermipy that were still compatible with Python 2.7. It is recommended to use the newest version; however, the former scripts compatible with the old conda environment exist within a folder at $FERMITOOLS. If you want to use the old version, please reach out to andrea.gokus[at]fau.de.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT, as well as information about the status of the spacecraft itself at each time. This information is the main input of an analysis. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory; hence, it is not necessary to download a specific dataset. If you are using the provided analysis scripts, they automatically look through all the data that are available at the observatory.&lt;br /&gt;
All photon files are linked at '''/home/X-ray/Fermi/archive/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/home/X-ray/Fermi/archive/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the Galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /home/X-ray/Fermi/archive/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/home/X-ray/Fermi/archive/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/home/X-ray/Fermi/archive/LAT_background_models'''. These models are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis, as the background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis of a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and both a start time and an end time, for which an average gamma-ray spectrum should be computed. There are several options you can choose from regarding the specifics of the analysis, which you can call via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
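The start and end times are given in MJD, while the Fermitools internally count time in mission elapsed time (MET), i.e. seconds since 2001-01-01 00:00:00 UTC (MJD 51910). If you need to translate between the two conventions, e.g. when comparing with MET values in log files, a minimal helper (ignoring leap seconds, which is negligible for day-scale time selections) could look like this:&lt;br /&gt;

```python
# Convert between MJD (used by the pipeline scripts) and Fermi
# mission elapsed time (MET, seconds since 2001-01-01 00:00:00 UTC).
# Note: this simple conversion ignores leap seconds.

MJD_REF = 51910.0  # MJD of the Fermi MET epoch (2001-01-01 00:00:00 UTC)

def mjd_to_met(mjd):
    """Convert a Modified Julian Date to Fermi MET in seconds."""
    return (mjd - MJD_REF) * 86400.0

def met_to_mjd(met):
    """Convert Fermi MET in seconds back to MJD."""
    return met / 86400.0 + MJD_REF

print(mjd_to_met(54682.0))  # MET of MJD 54682 (roughly the start of LAT science data)
```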
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see Remeis Wiki page)]&lt;br /&gt;
  Note: Cannot be used to search for unknown source close to a known 4FGL source.&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? If set to NO and a 4FGL source is&lt;br /&gt;
 			closer than 0.5 deg, the analysis will run on that close target! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --fix=yes/no         Fixes parameters for all sources to 4FGL values. Only to be used to check for&lt;br /&gt;
                        detection of new sources! Default: NO&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&lt;br /&gt;
   --myscript=path	Path to your own script if you do not want to use the standard Fermi scripts.&lt;br /&gt;
			Path needs to include the file name. Default: Using scripts in $FERMITOOLS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you do the analysis on a known gamma-ray source, which is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to add the coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
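As the help output warns, a 4FGL source closer than 0.5 deg to your target coordinates will be picked up instead of your target. If you want to check that distance yourself beforehand, a small pure-Python helper (illustrative only, not part of the pipeline scripts; the example coordinates are made up) is:&lt;br /&gt;

```python
import math

def angular_separation(ra1, dec1, ra2, dec2):
    """Angular separation in degrees between two sky positions
    (RAJ2000/DEJ2000 in degrees), using the Vincenty formula,
    which is numerically stable at all separations."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    dra = ra2 - ra1
    num = math.hypot(
        math.cos(dec2) * math.sin(dra),
        math.cos(dec1) * math.sin(dec2)
        - math.sin(dec1) * math.cos(dec2) * math.cos(dra),
    )
    den = (math.sin(dec1) * math.sin(dec2)
           + math.cos(dec1) * math.cos(dec2) * math.cos(dra))
    return math.degrees(math.atan2(num, den))

# Hypothetical target vs. a nearby catalog position:
sep = angular_separation(83.63, 22.01, 83.80, 22.10)
print(sep < 0.5)  # closer than 0.5 deg -> the analysis would lock onto the 4FGL source
```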
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The computation of a light curve is similarly called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and both a start time and an end time, for which the light curve should be computed. Again, several options are provided for specific analysis needs, which you can call via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see Remeis Wiki page)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
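Since the runtime scales with the number of light curve bins, it is useful to know beforehand how many bins a given time range and --timebin setting produce. A small sketch (not part of the pipeline scripts):&lt;br /&gt;

```python
import math

def n_lightcurve_bins(tmin_mjd, tmax_mjd, timebin_days=7.0):
    """Number of light curve bins for a given MJD range and bin
    length (the pipeline default is weekly binning, 7 days).
    A partially covered last bin still counts as one bin."""
    if tmax_mjd <= tmin_mjd:
        raise ValueError("tmax must be larger than tmin")
    return math.ceil((tmax_mjd - tmin_mjd) / timebin_days)

# One year of weekly bins:
print(n_lightcurve_bins(58000, 58365))        # 53
# The same year with daily binning:
print(n_lightcurve_bins(58000, 58365, 1.0))   # 365
```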
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take several hours up to days (if you want to compute many bins for your light curve). Ideally you want to let the cluster perform your jobs for you and not run an analysis on your desktop machine, unless you want to test something. In order to make life easy for a job submission on slurm, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' should look like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the name and the time ranges for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''', gives a full overview of all options available here:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known = yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --fix=yes/no     Fixes parameters for all sources to 4FGL values. Only to be used to check for&lt;br /&gt;
			detection of new sources! Default: NO&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name, RAJ2000, and DEJ2000 for&lt;br /&gt;
			name and coordinates. If '--ra' and '--dec' are also given, the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, set only number in unit of Gigabyte; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailadress	If you want to get informed about the slurm status per email, put email here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&lt;br /&gt;
   --myscript=path	Path to your own script if you do not want to use the standard Fermi scripts.&lt;br /&gt;
			Path needs to include the file name. NOTE: Only implemented for spectra, not LC calculation!&lt;br /&gt;
			Default: Using scripts in $FERMITOOLS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. Especially the memory and walltime options need to be increased in case you want to compute a light curve with more than 50 bins.&lt;br /&gt;
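A rough way to pick the --walltime value is to scale it with the number of bins. The per-bin runtime below is an assumed ballpark figure, not a measured value; adjust it to your own experience:&lt;br /&gt;

```python
import math

# Assumption: ~15 minutes of runtime per light curve bin.
HOURS_PER_BIN = 0.25

def suggested_walltime_hours(n_bins, hours_per_bin=HOURS_PER_BIN,
                             safety_factor=2.0, minimum=10):
    """Suggested --walltime in whole hours, padded by a safety
    factor and never below the script's 10-hour default."""
    return max(minimum, math.ceil(n_bins * hours_per_bin * safety_factor))

print(suggested_walltime_hours(50))   # 25 -> above the 10 h default
print(suggested_walltime_hours(10))   # short jobs stay at the 10 h default
```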
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this is not possible to use via slurm at the moment. If you're running an analysis on your machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information on this, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''How can I submit an array job?''' - In order to do that, you need to work from within a bash shell, not a (t)csh shell. I recommend writing a script that creates the .slurm file you want to submit to slurm. Here's an example of what such an array job can look like:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --job-name LATspectra&lt;br /&gt;
#SBATCH --output job.out-%a&lt;br /&gt;
#SBATCH --error err.out-%a&lt;br /&gt;
#SBATCH --time 48:00:00&lt;br /&gt;
#SBATCH --partition=remeis&lt;br /&gt;
#SBATCH --ntasks=1&lt;br /&gt;
#SBATCH --hint=nomultithread&lt;br /&gt;
#SBATCH --mem-per-cpu=4G&lt;br /&gt;
#SBATCH --mail-type=END,FAIL,ARRAY_TASKS&lt;br /&gt;
#SBATCH --mail-user=my@email.de&lt;br /&gt;
#SBATCH --array 0-XXX%YY&lt;br /&gt;
# XXX: number of tasks in the array, YY: number of jobs running simultaneously&lt;br /&gt;
&lt;br /&gt;
COMMAND[0]=&amp;quot;/software/Science/satscripts/fermiscripts/make_spectrum.sl --expath=/your_path/ 4FGLJXXX.X-XXXX mjd_start mjd_stop&amp;quot;&lt;br /&gt;
COMMAND[1]=&amp;quot;/software/Science/satscripts/fermiscripts/make_spectrum.sl --expath=/your_path/ 4FGLJYYY.Y-YYYY mjd_start mjd_stop&amp;quot;&lt;br /&gt;
COMMAND[2]=&amp;quot;...&amp;quot;&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
# Execute the command for this array task&lt;br /&gt;
eval &amp;quot;${COMMAND[$SLURM_ARRAY_TASK_ID]}&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
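Such a .slurm file can be generated with a short script. The sketch below is illustrative only: the source names, output path, and email address are placeholders, and it simply reproduces the directives from the example above for a list of sources:&lt;br /&gt;

```python
# Generate a slurm array-job file that runs make_spectrum.sl once
# per source. All paths, names, and the email are placeholders.

def make_array_job(sources, tmin, tmax,
                   expath="/your_path/",
                   email="my@email.de",
                   max_parallel=10,
                   script="/software/Science/satscripts/fermiscripts/make_spectrum.sl"):
    """Return the text of a .slurm file that computes a spectrum
    for each source in `sources` as one slurm array task."""
    lines = [
        "#!/bin/bash",
        "#SBATCH --job-name LATspectra",
        "#SBATCH --output job.out-%a",
        "#SBATCH --error err.out-%a",
        "#SBATCH --time 48:00:00",
        "#SBATCH --partition=remeis",
        "#SBATCH --ntasks=1",
        "#SBATCH --hint=nomultithread",
        "#SBATCH --mem-per-cpu=4G",
        "#SBATCH --mail-type=END,FAIL,ARRAY_TASKS",
        f"#SBATCH --mail-user={email}",
        f"#SBATCH --array 0-{len(sources) - 1}%{max_parallel}",
        "",
    ]
    for i, src in enumerate(sources):
        lines.append(f'COMMAND[{i}]="{script} --expath={expath} {src} {tmin} {tmax}"')
    lines += ["", 'eval "${COMMAND[$SLURM_ARRAY_TASK_ID]}"', ""]
    return "\n".join(lines)

job = make_array_job(["4FGLJ0001.1+0001", "4FGLJ0002.2+0002"], 58000, 58365)
print(job)
```

Write the returned text to a file and submit it with ''sbatch''.&lt;br /&gt;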
&lt;br /&gt;
'''Can the scripts find possibly unknown gamma-ray sources by themselves?''' - Yes, when running the analysis script to create a spectrum (with make_spectrum.sl), there is a routine implemented (the fermipy function ''find_sources''), which searches for excesses in the TS map with a significance of 5 sigma or higher. The coordinates of a found source are saved in a file called ''new_source.dat'' in the folder in which the analysis is run, to allow further inspection and a follow-up analysis.&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
During the Fermi data analysis, several errors can pop up and are not necessarily linked to the Fermi analysis itself. Here, I'll show those I encountered and what I have done to (hot)fix them.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
The internal sub-routine for creating the LCs threw an error a couple of times, stating that one of the LC bin folders was missing and that, hence, the light curve could not be assembled into one fits file. Because I couldn't pinpoint the error in the source code, my work-around for this issue is implemented in the wrapper scripts ''make_spectrum.sl'' and ''make_lightcurve.sl''. Deleting the subfolders is now handled there and not via the routine in the source code. &lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Individual bins crashed and then the light curve cannot be assembled''' &amp;lt;br/&amp;gt;&lt;br /&gt;
This is a problem that I fixed manually by editing the source code in the lightcurve.py file. I have [https://github.com/fermiPy/fermipy/issues/405 reported this issue to the developers], and it should be fixed in fermipy version 1.1.4 and up.&lt;br /&gt;
If you're using an older version of fermipy, you can use the following hot fix (additions marked with &amp;amp;rArr;):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# this snippet is near the bottom of the file&lt;br /&gt;
&lt;br /&gt;
        itimes = enumerate(zip(times[:-1], times[1:]))&lt;br /&gt;
          &amp;amp;rArr; flux_const_for_calcTS_var = None&lt;br /&gt;
        for i, time in itimes:&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if not 'fit_success' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; self.logger.error(&lt;br /&gt;
                  &amp;amp;rArr; 'fit_success not found in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                  &amp;amp;rArr; continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; [el]if not mapo[i]['fit_success']:&lt;br /&gt;
                self.logger.error(&lt;br /&gt;
                    'Fit failed in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if 'flux_const' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; flux_const_for_calcTS_var = mapo[i]['flux_const']&lt;br /&gt;
&lt;br /&gt;
            for k in o.keys():&lt;br /&gt;
&lt;br /&gt;
                if k == 'config':&lt;br /&gt;
                    continue&lt;br /&gt;
                if not k in mapo[i]:&lt;br /&gt;
                    continue&lt;br /&gt;
                # if (isinstance(o[k], np.ndarray) and&lt;br /&gt;
                #    o[k][i].shape != mapo[i][k].shape):&lt;br /&gt;
                #    gta.logger.warning('Incompatible shape for column %s', k)&lt;br /&gt;
                #    continue&lt;br /&gt;
&lt;br /&gt;
                try:&lt;br /&gt;
                    o[k][i] = mapo[i][k]&lt;br /&gt;
                except:&lt;br /&gt;
                    pass&lt;br /&gt;
&lt;br /&gt;
        systematic = kwargs.get('systematic', 0.02)&lt;br /&gt;
&lt;br /&gt;
        o['ts_var'] = calcTS_var(loglike=o['loglike_fixed'],&lt;br /&gt;
                                 loglike_const=o['loglike_const'],&lt;br /&gt;
                                 flux_err=o['flux_err_fixed'],&lt;br /&gt;
                                   &amp;amp;rArr; flux_const=flux_const_for_calcTS_var,&lt;br /&gt;
                                 systematic=systematic,&lt;br /&gt;
                                 fit_success=o['fit_success_fixed'])&lt;br /&gt;
&lt;br /&gt;
        return o&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
While /karl has the benefit of storing large amounts of data, there is an issue with computing many Fermi spectra or light curves at the same time while storing the results on /karl. The problems that arise because of this are not clearly visible from checking the log files. An error that can be related to this looks, for example, like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GTBinnedAnalysis.run_gtapp(): Caught N3tip12TipExceptionE at the top level: setNumRecords could not insert rows in FITS table in extension &amp;quot;EVENTS&amp;quot; in file &amp;quot;/path_to_analysis/LC_analysis/ft1_00.fits&amp;quot; (CFITSIO ERROR 108: error reading from FITS file)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In case you need to compute a lot of spectra or light curves and want to do this on /karl, either limit the number of jobs running simultaneously to 10, or store the data on /userdata, where I have so far never encountered this problem.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
If you encounter a new error, you can also go ahead and start the python script for the analysis directly in the shell. Remember to load your conda environment first. You also need a ''config_SED.yaml'' or ''config_LC.yaml'' file in the folder where you are going to run the analysis, otherwise you won't get far. You can directly run the script, e.g. via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(fermipy)[user@computer]&amp;gt; $FERMITOOLS/SEDcreator_fermipy.py&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve analysis fails because of too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
Computing a long light curve with only a short binning (e.g. daily binning for a year-long light curve, or weekly binning for a light curve covering a few years) can result in a termination of the analysis process when the computer cannot handle the number of open files anymore. I'm not entirely sure what goes wrong internally; however, my suggested hot fix is to split the computation of the light curve into manageable time intervals (e.g. chunks of ~100 bins). After all chunks have been computed, you can put them together using the command ''ftmerge'':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
&lt;br /&gt;
# Step 1: Create list of all light curve chunk files&lt;br /&gt;
ls -1 /path_to_my_analysis_results/source_name/LC_analysis/*_lightcurve.fits &amp;gt; /path_to_my_analysis_results/source_name.list&lt;br /&gt;
&lt;br /&gt;
# /path_to_my_analysis_results/ should contain the folders of each chunk, e.g. named MJD_start_end&lt;br /&gt;
&lt;br /&gt;
# Step 2: Merge all light curve files into one fits file&lt;br /&gt;
ftmerge @/path_to_my_analysis_results/source_name.list /path_to_my_analysis_results/lightcurve_source_name.fits clobber=yes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
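The splitting itself can also be scripted. Here is an illustrative helper (not part of the pipeline scripts) that divides an MJD range into chunks of at most 100 light curve bins each; each chunk can then be submitted as its own job:&lt;br /&gt;

```python
import math

def split_time_range(tmin, tmax, timebin_days=1.0, bins_per_chunk=100):
    """Split [tmin, tmax] (MJD) into chunks containing at most
    `bins_per_chunk` light curve bins each, so every chunk can be
    run as a separate job and merged with ftmerge afterwards."""
    chunk_days = bins_per_chunk * timebin_days
    n_chunks = math.ceil((tmax - tmin) / chunk_days)
    return [(tmin + i * chunk_days, min(tmin + (i + 1) * chunk_days, tmax))
            for i in range(n_chunks)]

# A year of daily bins split into chunks of at most 100 bins:
for start, stop in split_time_range(58000, 58365):
    print(start, stop)
```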
No worries about losing already computed bins in case your analysis crashes: they are not deleted when the script crashes mid-way, so already computed LC bins are still available after the crash and it is not necessary to redo the analysis for all of the light curve bins.&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: January 8, 2025&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=3549</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=3549"/>
		<updated>2025-01-08T21:32:29Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the script, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is an extensive documentation available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, please find the information on the Fermi contact person at the Remeis observatory at the bottom of the page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. In addition, the python package ''fermipy'' allows one to run the data analysis from within a python environment. &lt;br /&gt;
At Remeis, analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. They can be found at $FERMITOOLS, which links to /data/software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, both the Fermitools and ''fermipy'' must be installed, ideally in a conda environment. There are two options: using the existing conda environment, or creating the corresponding conda environment yourself. The first option is currently the recommended choice; for the second, you need to make sure that compatible versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 (recommended) - Using the existing conda environment (Fermitools 2.2.0 &amp;amp; fermipy version 1.3.1) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with this conda environment. To invoke it, you first need to source the conda installation in which the environment has been created, and then activate the environment. To simplify this, you can copy the following into your .cshrc file:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda3/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy3 &amp;quot;conda activate fermipy_3&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing this conda installation by default!'''&lt;br /&gt;
In case you're using a bash shell, the addition in your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda3/etc/profile.d/conda.sh&lt;br /&gt;
alias fermipy3='conda activate /userdata/data/gokus/conda/miniconda3/envs/fermipy_3'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy3''', you can now activate the conda environment in which the tools needed by the pipeline scripts are installed. Now you're ready to do some science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub, and the full code, documentation, and installation instructions can be found at&amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you install the right version of each (support changed from Python 2.7 to Python 3 between versions). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
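As a rough sketch only, creating such an environment could look like the following. The exact package versions and conda channels below are assumptions, not a tested recipe; always check the installation instructions linked above for the currently supported combination.&lt;br /&gt;

```shell
# Sketch, not a tested recipe: create a conda environment containing the
# Fermitools and install fermipy on top (version numbers are assumptions).
conda create --name fermipy -c conda-forge -c fermi python=3.9 "fermitools=2.2.0"
conda activate fermipy
pip install fermipy==1.3.1
```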
&lt;br /&gt;
=== If you want to use the old conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
An earlier version of the analysis pipeline ran on Python 2.7, with the most recent versions of the Fermitools and fermipy that were still compatible with it. The newest version is recommended; however, the former scripts compatible with the old conda environment still exist in a folder at $FERMITOOLS. If you want to use the old version, please reach out to andrea.gokus[at]fau.de.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT and about the status of the spacecraft itself at each time. This information is the main input for any analysis. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, hence it is not necessary to download a specific dataset. If you are using the provided analysis scripts, they automatically use all the data that are available at the observatory.&lt;br /&gt;
All photon files are linked in '''/home/X-ray/Fermi/archive/photon_ft1_files.list''', while all spacecraft data are merged into '''/home/X-ray/Fermi/archive/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the Galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /home/X-ray/Fermi/archive/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/home/X-ray/Fermi/archive/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models needed to model the Galactic and isotropic diffuse emission can be found at '''/home/X-ray/Fermi/archive/LAT_background_models'''. They are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as the background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS perform a standard analysis of a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV, and that they have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and a start and end time between which an average gamma-ray spectrum should be computed. There are several options regarding the specifics of the analysis, which you can list via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see Remeis Wiki page)]&lt;br /&gt;
  Note: Cannot be used to search for unknown source close to a known 4FGL source.&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? If set to NO and a 4FGL source is&lt;br /&gt;
 			closer than 0.5 deg, the analysis will run on that close target! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --fix=yes/no         Fixes parameters for all sources to 4FGL values. Only to be used to check for&lt;br /&gt;
                        detection of new sources! Default: NO&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&lt;br /&gt;
   --myscript=path	Path to your own script if you do not want to use the standard Fermi scripts.&lt;br /&gt;
			Path needs to include the file name. Default: Using scripts in $FERMITOOLS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you are analysing a known gamma-ray source that is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to add its coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
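For instance, a hypothetical call could look like the following; the source name, time range, and output path are placeholders, not a recommended setup:&lt;br /&gt;

```shell
# Hypothetical example: average spectrum of 4FGL J1104.4+3812 (Mrk 421)
# between MJD 59000 and 59100; --expath is a placeholder path.
$FERMITOOLS/make_spectrum.sl --expath=/path/to/output/ 4FGLJ1104.4+3812 59000 59100

# For a source that is not in the 4FGL, pass its coordinates explicitly:
$FERMITOOLS/make_spectrum.sl --known=no --ra=166.114 --dec=38.209 MyNewSource 59000 59100
```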
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
If you are interested in a gamma-ray light curve, the corresponding script is called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and a start and end time between which the light curve should be computed. Again, several options are provided for specific analysis needs, which you can list via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see Remeis Wiki page)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
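A hypothetical call for a daily-binned light curve could look like the following (source name, time range, and output path are placeholders):&lt;br /&gt;

```shell
# Hypothetical example: daily binning instead of the weekly default;
# --savebins=True keeps each bin's folder and can use a lot of disk space.
$FERMITOOLS/make_lightcurve.sl --timebin=1 --savebins=True --expath=/path/to/output/ 4FGLJ1104.4+3812 59000 59100
```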
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take several hours up to days (if you want to compute many bins for your light curve). Ideally, you let the cluster perform your jobs rather than running an analysis on your desktop machine, unless you want to test something. To make a slurm job submission easy, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' looks like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the source name and the time range for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''' gives a full overview of all available options:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known = yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --fix=yes/no     Fixes parameters for all sources to 4FGL values. Only to be used to check for&lt;br /&gt;
			detection of new sources! Default: NO&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name, RAJ2000, and DEJ2000 for&lt;br /&gt;
			name and coordinates. If '--ra' and '--dec' are also given, the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, set only number in unit of Gigabyte; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailadress	If you want to get informed about the slurm status per email, put email here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&lt;br /&gt;
   --myscript=path	Path to your own script if you do not want to use the standard Fermi scripts.&lt;br /&gt;
			Path needs to include the file name. NOTE: Only implemented for spectra, not LC calculation!&lt;br /&gt;
			Default: Using scripts in $FERMITOOLS&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. In particular, the memory and walltime options need to be increased in case you want to compute a light curve with more than 50 bins.&lt;br /&gt;
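For example, a long daily-binned light curve might be submitted with increased resources; the numbers below are illustrative assumptions, not tuned recommendations:&lt;br /&gt;

```shell
# Hypothetical example: 2-year, daily-binned light curve submitted with a
# longer walltime and more memory per CPU than the defaults.
$FERMITOOLS/submit_fermijob.sl --timebin=1 --walltime=48 --mem=6 --email=my@email.de LC 4FGLJ1104.4+3812 58000 58730
```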
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this is not possible via slurm at the moment. If you're running an analysis on your own machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''How can I submit an array job?''' - To do that, you need to work from within a bash shell, not a (t)csh shell. It is recommended to write a script that creates the .slurm file you want to submit. Here's an example of how such an array job can look:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --job-name LATspectra&lt;br /&gt;
#SBATCH --output job.out-%a&lt;br /&gt;
#SBATCH --error err.out-%a&lt;br /&gt;
#SBATCH --time 48:00:00&lt;br /&gt;
#SBATCH --partition=remeis&lt;br /&gt;
#SBATCH --ntasks=1&lt;br /&gt;
#SBATCH --hint=nomultithread&lt;br /&gt;
#SBATCH --mem-per-cpu=4G&lt;br /&gt;
#SBATCH --mail-type=END,FAIL,ARRAY_TASKS&lt;br /&gt;
#SBATCH --mail-user=my@email.de&lt;br /&gt;
#SBATCH --array 0-XXX%YY (XXX: number of tasks in the array, YY: number of jobs running simultaneously)&lt;br /&gt;
&lt;br /&gt;
COMMAND[0]=&amp;quot;/software/Science/satscripts/fermiscripts/make_spectrum.sl --expath=/your_path/ 4FGLJXXX.X-XXXX mjd_start mjd_stop&amp;quot;&lt;br /&gt;
COMMAND[1]=&amp;quot;/software/Science/satscripts/fermiscripts/make_spectrum.sl --expath=/your_path/ 4FGLJYYY.Y-YYYY mjd_start mjd_stop&amp;quot;&lt;br /&gt;
COMMAND[2]=&amp;quot;...&amp;quot;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Can the scripts find possibly unknown gamma-ray sources by themselves?''' - Yes, when running the analysis script that creates a spectrum (make_spectrum.sl), a routine (the fermipy function ''find_sources'') searches for excesses in the TS map with a significance of 5 sigma or higher. The coordinates of a found source are saved in a file called ''new_source.dat'' in the folder in which the analysis is run, to allow further inspection and a follow-up analysis.&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
During a Fermi data analysis, several errors can pop up that are not necessarily linked to the Fermi analysis itself. Here, I'll show those I encountered and what I have done to (hot)fix them.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
The internal sub-routine for creating the LCs threw an error a couple of times, stating that one of the LC bin folders was missing and hence the light curve could not be assembled into one fits file. Because I couldn't pinpoint the error in the source code, my work-around is implemented in the wrapper scripts ''make_spectrum.sl'' and ''make_lightcurve.sl'': deleting the subfolders is now handled there and not via the routine in the source code. &lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Individual bins crashed and then the light curve cannot be assembled''' &amp;lt;br/&amp;gt;&lt;br /&gt;
This is a problem that I fixed manually by editing the source code in the lightcurve.py file. I have [https://github.com/fermiPy/fermipy/issues/405 reported this issue to the developers], and it should be fixed in fermipy version 1.1.4 and up.&lt;br /&gt;
If you're using an older version of fermipy, you can use the following hot fix (additions marked with &amp;amp;rArr;):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# this snippet is near the bottom of the file&lt;br /&gt;
&lt;br /&gt;
        itimes = enumerate(zip(times[:-1], times[1:]))&lt;br /&gt;
          &amp;amp;rArr; flux_const_for_calcTS_var = None&lt;br /&gt;
        for i, time in itimes:&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if not 'fit_success' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; self.logger.error(&lt;br /&gt;
                  &amp;amp;rArr; 'fit_success not found in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                  &amp;amp;rArr; continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; elif not mapo[i]['fit_success']:  # 'if' changed to 'elif'&lt;br /&gt;
                self.logger.error(&lt;br /&gt;
                    'Fit failed in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if 'flux_const' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; flux_const_for_calcTS_var = mapo[i]['flux_const']&lt;br /&gt;
&lt;br /&gt;
            for k in o.keys():&lt;br /&gt;
&lt;br /&gt;
                if k == 'config':&lt;br /&gt;
                    continue&lt;br /&gt;
                if not k in mapo[i]:&lt;br /&gt;
                    continue&lt;br /&gt;
                # if (isinstance(o[k], np.ndarray) and&lt;br /&gt;
                #    o[k][i].shape != mapo[i][k].shape):&lt;br /&gt;
                #    gta.logger.warning('Incompatible shape for column %s', k)&lt;br /&gt;
                #    continue&lt;br /&gt;
&lt;br /&gt;
                try:&lt;br /&gt;
                    o[k][i] = mapo[i][k]&lt;br /&gt;
                except:&lt;br /&gt;
                    pass&lt;br /&gt;
&lt;br /&gt;
        systematic = kwargs.get('systematic', 0.02)&lt;br /&gt;
&lt;br /&gt;
        o['ts_var'] = calcTS_var(loglike=o['loglike_fixed'],&lt;br /&gt;
                                 loglike_const=o['loglike_const'],&lt;br /&gt;
                                 flux_err=o['flux_err_fixed'],&lt;br /&gt;
                                   &amp;amp;rArr; flux_const=flux_const_for_calcTS_var,&lt;br /&gt;
                                 systematic=systematic,&lt;br /&gt;
                                 fit_success=o['fit_success_fixed'])&lt;br /&gt;
&lt;br /&gt;
        return o&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
While /karl has the benefit of storing large amounts of data, there is an issue with computing many Fermi spectra or light curves at the same time while storing the results on /karl. The resulting problems are not clearly visible from checking the log files. A related error can, for example, look like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GTBinnedAnalysis.run_gtapp(): Caught N3tip12TipExceptionE at the top level: setNumRecords could not insert rows in FITS table in extension &amp;quot;EVENTS&amp;quot; in file &amp;quot;/path_to_analysis/LC_analysis/ft1_00.fits&amp;quot; (CFITSIO ERROR 108: error reading from FITS file)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In case you need to compute a lot of spectra or light curves and want to do this on /karl, either limit the number of jobs running simultaneously to 10, or store the data on /userdata, where I have so far never encountered this problem.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
If you encounter a new error, you can also go ahead and start the python script for the analysis directly in the shell. Remember to load your conda environment first. You also need a ''config_SED.yaml'' or ''config_LC.yaml'' file in the folder where you are going to run the analysis, otherwise you won't get far. You can directly run the script, e.g. via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(fermipy)[user@computer]&amp;gt; $FERMITOOLS/SEDcreator_fermipy.py&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve analysis fails because of too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
Computing a long light curve with short binning (e.g. daily binning for a year-long light curve, or weekly binning for a light curve covering a few years) can terminate the analysis process when the computer cannot handle the number of open files anymore. I'm not entirely sure what goes wrong internally; my suggested hot fix is to split the computation of the light curve into manageable time intervals (e.g. chunks of ~100 bins). After all chunks have been computed, you can put them together using the command ''ftmerge'':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
&lt;br /&gt;
# Step 1: Create list of all light curve chunk files&lt;br /&gt;
ls -1 /path_to_my_analysis_results/source_name/LC_analysis/*_lightcurve.fits &amp;gt; /path_to_my_analysis_results/source_name.list&lt;br /&gt;
&lt;br /&gt;
# /path_to_my_analysis_results/ should contain the folders of each chunk, e.g. named MJD_start_end&lt;br /&gt;
&lt;br /&gt;
# Step 2: Merge all light curve files into one fits file&lt;br /&gt;
ftmerge @/path_to_my_analysis_results/source_name.list /path_to_my_analysis_results/lightcurve_source_name.fits clobber=yes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
No worries about losing already computed bins if your analysis crashes: bins that finished before the crash are not deleted, so only the missing light curve bins need to be re-computed, not the entire light curve.&lt;br /&gt;
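The splitting into chunks can be scripted. Below is a minimal bash sketch, written as a dry run that only prints the ''submit_fermijob.sl'' calls; the time range, chunk length, and 4FGL name are placeholder assumptions:&lt;br /&gt;

```shell
#!/bin/bash
# Dry run: print one submit_fermijob.sl call per light curve chunk.
# Remove the "echo" to actually submit; all values below are placeholders.
TSTART=55000   # MJD, start of the full light curve
TSTOP=57100    # MJD, end of the full light curve
CHUNK=700      # 700 days = ~100 bins for the default weekly binning
tmin=$TSTART
while [ "$tmin" -lt "$TSTOP" ]; do
    tmax=$(( tmin + CHUNK ))
    if [ "$tmax" -gt "$TSTOP" ]; then tmax=$TSTOP; fi
    echo "$FERMITOOLS/submit_fermijob.sl --expath=MJD_${tmin}_${tmax}/ LC 4FGLJ1104.4+3812 $tmin $tmax"
    tmin=$tmax
done
```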
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: January 8, 2025&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Skivergnuegen_2023&amp;diff=2752</id>
		<title>Skivergnuegen 2023</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Skivergnuegen_2023&amp;diff=2752"/>
		<updated>2023-01-03T09:35:33Z</updated>

		<summary type="html">&lt;p&gt;Gokus: /* Fahrer */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;'''Remeis Skiing 2023''' The same procedure as every year ;-) &lt;br /&gt;
The most important organisational details are collected here. Everyone is welcome to edit; if you don't have a wiki account, please ask someone who has access.&lt;br /&gt;
&lt;br /&gt;
[[File:154854 silvretta-montafon-panorama.jpeg|right|700px]]&lt;br /&gt;
&lt;br /&gt;
== General information ==&lt;br /&gt;
&lt;br /&gt;
*Holiday apartment(s):&lt;br /&gt;
**We will again stay in the [https://www.grandau.at/montafon/zimmer/ferienhaus-enzian Haus Enzian] of the [https://www.grandau.at/ Sporthotel Grandau].&lt;br /&gt;
**Address of the Sporthotel: Montafonerstraße 274a, 6791 St. Gallenkirch, Austria. Our accommodations are located on Türkeiweg.&lt;br /&gt;
&lt;br /&gt;
*Ski area: Montafon (Vorarlberg)&lt;br /&gt;
**Ski area map: https://winter.intermaps.com/montafon?lang=de&lt;br /&gt;
**Ski pass prices: https://www.montafon.at/de/Service/Bergbahn-Preise-Tickets/Mehrtageskarte-Winter&lt;br /&gt;
**If you don't want to ski all 7 days: there are offers such as 5 out of 7, i.e. a ski pass valid on any 5 days within a 7-day period.&lt;br /&gt;
&lt;br /&gt;
*Ski rental: [http://www.sportharry.at/ Sport Harry], directly at the valley station.&lt;br /&gt;
&lt;br /&gt;
== Accommodation ==&lt;br /&gt;
=== Room assignment ===&lt;br /&gt;
&lt;br /&gt;
This year there is only the chalet with the following rooms (we are 14 people in total, so no beds remain free).&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Room&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Occupants&lt;br /&gt;
|-&lt;br /&gt;
! scope=&amp;quot;row&amp;quot; | 2-bed&lt;br /&gt;
| Amy, Aafia&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
! scope=&amp;quot;row&amp;quot; | 3-bed&lt;br /&gt;
| Jakob, Max, Katya&lt;br /&gt;
|-&lt;br /&gt;
! scope=&amp;quot;row&amp;quot; | 4-bed&lt;br /&gt;
| Eva, Steffen, Flo, Christian&lt;br /&gt;
|-&lt;br /&gt;
! scope=&amp;quot;row&amp;quot; | 5-bed&lt;br /&gt;
| Basti&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Drivers ==&lt;br /&gt;
Drivers can sign up here with their key details. Those who would like a ride should coordinate with the drivers and also sign up. Please also sort out the luggage situation; if necessary, another driver can take e.g. ski equipment.&lt;br /&gt;
&lt;br /&gt;
'''Looking for a ride: '''&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Driver&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | # Seats&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Arrival&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Passengers (arrival)&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Departure&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Passengers (departure)&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Ski transport&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Comment&lt;br /&gt;
|-&lt;br /&gt;
|Eva&lt;br /&gt;
|0(3)&lt;br /&gt;
|Saturday&lt;br /&gt;
|Steffen+Christian&lt;br /&gt;
|Saturday?&lt;br /&gt;
|Steffen+Christian&lt;br /&gt;
|own skis: yes&lt;br /&gt;
|Car is full&lt;br /&gt;
|-&lt;br /&gt;
|Max&lt;br /&gt;
|0(3)&lt;br /&gt;
|Saturday&lt;br /&gt;
|Jakob+Katya&lt;br /&gt;
|Saturday&lt;br /&gt;
|Jakob+Katya&lt;br /&gt;
|own skis: yes&lt;br /&gt;
|Car is full&lt;br /&gt;
|-&lt;br /&gt;
|Basti&lt;br /&gt;
|3(3)&lt;br /&gt;
|Sunday very early&lt;br /&gt;
|&lt;br /&gt;
|Friday noon&lt;br /&gt;
|&lt;br /&gt;
|Yes&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Bulletin board ==&lt;br /&gt;
&lt;br /&gt;
=== Food ===&lt;br /&gt;
Here is the meal plan, largely carried over from previous years. Since we are quite a large group, we should again roughly plan in advance what we want to cook. We can buy some things in Germany and bring them along, but there is also a supermarket nearby. General items such as spices or spreads for breakfast rolls could also be brought from home.&lt;br /&gt;
&lt;br /&gt;
If you have cooking suggestions or other ideas, feel free to add and comment on them below. Keep in mind that preparation should be (relatively) simple. Since the list has worked well in recent years, we will keep it, but the order/days can certainly still be shuffled around. A few things will sometimes still have to be bought at the Spar, since the fridge only has limited space.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Meal plan:'''&lt;br /&gt;
*Saturday: spaghetti Bolognese + veg. Bolognese&lt;br /&gt;
*Sunday: Käsespätzle&lt;br /&gt;
*Monday: burritos&lt;br /&gt;
*Tuesday: risotto&lt;br /&gt;
*Wednesday: goulash&lt;br /&gt;
*Thursday: potatoes with herb quark&lt;br /&gt;
*Friday: dal with rice (or sweet potato curry)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Shopping list from 2022 (last year): https://docs.google.com/document/d/166b_mIXxmoAgcv1upIGremd_36E1fmwwxsQADP2uEv8/&lt;br /&gt;
&lt;br /&gt;
@Christian: please either bring a vegan option yourself or add it here (if it is easy to get)&lt;br /&gt;
&lt;br /&gt;
=== Drinks/Party ===&lt;br /&gt;
&lt;br /&gt;
=== Games ===&lt;br /&gt;
&lt;br /&gt;
Shared game nights are always fun, so if you own any board games, feel free to bring them along! To get an overview, please also list them here:&lt;br /&gt;
&lt;br /&gt;
''* List games here''&lt;br /&gt;
&lt;br /&gt;
=== Miscellaneous ===&lt;br /&gt;
&lt;br /&gt;
=== Alternative activities ===&lt;br /&gt;
  *  Sledding (http://www.montafon.at/de/urlaubswelten/echte_naturliebhaber/rodeln)&lt;br /&gt;
  *  Snowshoe hiking (http://www.montafon.at/schneeschuhwanderungen)&lt;br /&gt;
  *  Thermal spa (http://www.montafon.at/schwimmen), e.g. http://www.aqua-dome.at/de (about 130 km away!)&lt;br /&gt;
&lt;br /&gt;
[[Category:Internal]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Skivergnuegen_2023&amp;diff=2751</id>
		<title>Skivergnuegen 2023</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Skivergnuegen_2023&amp;diff=2751"/>
		<updated>2023-01-03T09:34:57Z</updated>

		<summary type="html">&lt;p&gt;Gokus: /* Fahrer */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;'''Remeis Skiing 2023''' The same procedure as every year ;-) &lt;br /&gt;
The most important organizational details are collected here. Everyone is welcome to edit; if you do not have a wiki account, please ask someone who has access.&lt;br /&gt;
&lt;br /&gt;
[[File:154854 silvretta-montafon-panorama.jpeg|right|700px]]&lt;br /&gt;
&lt;br /&gt;
== General information ==&lt;br /&gt;
&lt;br /&gt;
*Holiday apartment(s):&lt;br /&gt;
**We will again stay in the [https://www.grandau.at/montafon/zimmer/ferienhaus-enzian Haus Enzian] of the [https://www.grandau.at/ Sporthotel Grandau].&lt;br /&gt;
**Address of the Sporthotel: Montafonerstraße 274a, 6791 St. Gallenkirch, Austria. Our accommodation is on Türkeiweg.&lt;br /&gt;
&lt;br /&gt;
*Ski area: Montafon (Vorarlberg)&lt;br /&gt;
**Ski area map: https://winter.intermaps.com/montafon?lang=de&lt;br /&gt;
**Ski pass prices: https://www.montafon.at/de/Service/Bergbahn-Preise-Tickets/Mehrtageskarte-Winter&lt;br /&gt;
**If you do not want to ski on all 7 days: there are offers such as a 5-out-of-7 pass, which lets you ski on any 5 days within a 7-day period.&lt;br /&gt;
&lt;br /&gt;
*Ski rental: [http://www.sportharry.at/ Sport Harry], right at the valley station.&lt;br /&gt;
&lt;br /&gt;
== Accommodation ==&lt;br /&gt;
=== Room assignment ===&lt;br /&gt;
&lt;br /&gt;
This year there is only the one chalet, with the following rooms (we are 14 people in total, so no beds remain free).&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Room&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Occupants&lt;br /&gt;
|-&lt;br /&gt;
! scope=&amp;quot;row&amp;quot; | 2-person&lt;br /&gt;
| Amy, Aafia&lt;br /&gt;
|-&lt;br /&gt;
! scope=&amp;quot;row&amp;quot; | 3-person&lt;br /&gt;
| Jakob, Max, Katya&lt;br /&gt;
|-&lt;br /&gt;
! scope=&amp;quot;row&amp;quot; | 4-person&lt;br /&gt;
| Eva, Steffen, Flo, Christian&lt;br /&gt;
|-&lt;br /&gt;
! scope=&amp;quot;row&amp;quot; | 5-person&lt;br /&gt;
| Basti&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Drivers ==&lt;br /&gt;
Drivers can sign up here and provide the key details. Anyone who would like a ride should coordinate with the drivers and add themselves as well. Please also sort out the luggage situation; if necessary, another driver can take e.g. ski equipment.&lt;br /&gt;
&lt;br /&gt;
'''Looking for a ride: '''&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Driver&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | # Seats&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Arrival&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Passengers (arrival)&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Departure&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Passengers (departure)&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Ski transport&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Comment&lt;br /&gt;
|-&lt;br /&gt;
|Eva&lt;br /&gt;
|0(3)&lt;br /&gt;
|Saturday&lt;br /&gt;
|Steffen+Christian&lt;br /&gt;
|Saturday?&lt;br /&gt;
|Steffen+Christian&lt;br /&gt;
|own skis, yes&lt;br /&gt;
|Car is full&lt;br /&gt;
|-&lt;br /&gt;
|Max&lt;br /&gt;
|0(3)&lt;br /&gt;
|Saturday&lt;br /&gt;
|Jakob+Katya&lt;br /&gt;
|Saturday&lt;br /&gt;
|Jakob+Katya&lt;br /&gt;
|own skis, yes&lt;br /&gt;
|Car is full&lt;br /&gt;
|-&lt;br /&gt;
|Basti&lt;br /&gt;
|3(3)&lt;br /&gt;
|Sunday very early&lt;br /&gt;
|&lt;br /&gt;
|Friday noon&lt;br /&gt;
|&lt;br /&gt;
|Yes&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Bulletin board ==&lt;br /&gt;
&lt;br /&gt;
=== Food ===&lt;br /&gt;
Here is the meal plan, largely carried over from previous years. Since we are quite a large group, we should again roughly plan in advance what we want to cook. We can buy some things in Germany and bring them along, but there is also a supermarket nearby. General items such as spices or spreads for breakfast rolls could also be brought from home.&lt;br /&gt;
&lt;br /&gt;
If you have cooking suggestions or other ideas, feel free to add and comment on them below. Keep in mind that preparation should be (relatively) simple. Since the list has worked well in recent years, we will keep it, but the order/days can certainly still be shuffled around. A few things will sometimes still have to be bought at the Spar, since the fridge only has limited space.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Meal plan:'''&lt;br /&gt;
*Saturday: spaghetti Bolognese + veg. Bolognese&lt;br /&gt;
*Sunday: Käsespätzle&lt;br /&gt;
*Monday: burritos&lt;br /&gt;
*Tuesday: risotto&lt;br /&gt;
*Wednesday: goulash&lt;br /&gt;
*Thursday: potatoes with herb quark&lt;br /&gt;
*Friday: dal with rice (or sweet potato curry)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Shopping list from 2022 (last year): https://docs.google.com/document/d/166b_mIXxmoAgcv1upIGremd_36E1fmwwxsQADP2uEv8/&lt;br /&gt;
&lt;br /&gt;
@Christian: please either bring a vegan option yourself or add it here (if it is easy to get)&lt;br /&gt;
&lt;br /&gt;
=== Drinks/Party ===&lt;br /&gt;
&lt;br /&gt;
=== Games ===&lt;br /&gt;
&lt;br /&gt;
Shared game nights are always fun, so if you own any board games, feel free to bring them along! To get an overview, please also list them here:&lt;br /&gt;
&lt;br /&gt;
''* List games here''&lt;br /&gt;
&lt;br /&gt;
=== Miscellaneous ===&lt;br /&gt;
&lt;br /&gt;
=== Alternative activities ===&lt;br /&gt;
  *  Sledding (http://www.montafon.at/de/urlaubswelten/echte_naturliebhaber/rodeln)&lt;br /&gt;
  *  Snowshoe hiking (http://www.montafon.at/schneeschuhwanderungen)&lt;br /&gt;
  *  Thermal spa (http://www.montafon.at/schwimmen), e.g. http://www.aqua-dome.at/de (about 130 km away!)&lt;br /&gt;
&lt;br /&gt;
[[Category:Internal]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Skivergnuegen_2023&amp;diff=2750</id>
		<title>Skivergnuegen 2023</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Skivergnuegen_2023&amp;diff=2750"/>
		<updated>2023-01-03T09:33:08Z</updated>

		<summary type="html">&lt;p&gt;Gokus: /* Aufteilung */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;'''Remeis Skiing 2023''' The same procedure as every year ;-) &lt;br /&gt;
The most important organizational details are collected here. Everyone is welcome to edit; if you do not have a wiki account, please ask someone who has access.&lt;br /&gt;
&lt;br /&gt;
[[File:154854 silvretta-montafon-panorama.jpeg|right|700px]]&lt;br /&gt;
&lt;br /&gt;
== General information ==&lt;br /&gt;
&lt;br /&gt;
*Holiday apartment(s):&lt;br /&gt;
**We will again stay in the [https://www.grandau.at/montafon/zimmer/ferienhaus-enzian Haus Enzian] of the [https://www.grandau.at/ Sporthotel Grandau].&lt;br /&gt;
**Address of the Sporthotel: Montafonerstraße 274a, 6791 St. Gallenkirch, Austria. Our accommodation is on Türkeiweg.&lt;br /&gt;
&lt;br /&gt;
*Ski area: Montafon (Vorarlberg)&lt;br /&gt;
**Ski area map: https://winter.intermaps.com/montafon?lang=de&lt;br /&gt;
**Ski pass prices: https://www.montafon.at/de/Service/Bergbahn-Preise-Tickets/Mehrtageskarte-Winter&lt;br /&gt;
**If you do not want to ski on all 7 days: there are offers such as a 5-out-of-7 pass, which lets you ski on any 5 days within a 7-day period.&lt;br /&gt;
&lt;br /&gt;
*Ski rental: [http://www.sportharry.at/ Sport Harry], right at the valley station.&lt;br /&gt;
&lt;br /&gt;
== Accommodation ==&lt;br /&gt;
=== Room assignment ===&lt;br /&gt;
&lt;br /&gt;
This year there is only the one chalet, with the following rooms (we are 14 people in total, so no beds remain free).&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Room&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Occupants&lt;br /&gt;
|-&lt;br /&gt;
! scope=&amp;quot;row&amp;quot; | 2-person&lt;br /&gt;
| Amy, Aafia&lt;br /&gt;
|-&lt;br /&gt;
! scope=&amp;quot;row&amp;quot; | 3-person&lt;br /&gt;
| Jakob, Max, Katya&lt;br /&gt;
|-&lt;br /&gt;
! scope=&amp;quot;row&amp;quot; | 4-person&lt;br /&gt;
| Eva, Steffen, Flo, Christian&lt;br /&gt;
|-&lt;br /&gt;
! scope=&amp;quot;row&amp;quot; | 5-person&lt;br /&gt;
| Basti&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Drivers ==&lt;br /&gt;
Drivers can sign up here and provide the key details. Anyone who would like a ride should coordinate with the drivers and add themselves as well. Please also sort out the luggage situation; if necessary, another driver can take e.g. ski equipment.&lt;br /&gt;
&lt;br /&gt;
'''Looking for a ride: '''&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Driver&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | # Seats&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Arrival&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Passengers (arrival)&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Departure&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Passengers (departure)&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Ski transport&lt;br /&gt;
! scope=&amp;quot;col&amp;quot; | Comment&lt;br /&gt;
|-&lt;br /&gt;
|Eva&lt;br /&gt;
|0(3)&lt;br /&gt;
|Saturday&lt;br /&gt;
|Steffen+Christian&lt;br /&gt;
|Saturday?&lt;br /&gt;
|Steffen+Christian&lt;br /&gt;
|own skis, yes&lt;br /&gt;
|Car is full&lt;br /&gt;
|-&lt;br /&gt;
|Max&lt;br /&gt;
|0(3)&lt;br /&gt;
|Saturday&lt;br /&gt;
|Jakob+Katya&lt;br /&gt;
|Saturday&lt;br /&gt;
|Jakob+Katya&lt;br /&gt;
|own skis, yes&lt;br /&gt;
|Car is full&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Bulletin board ==&lt;br /&gt;
&lt;br /&gt;
=== Food ===&lt;br /&gt;
Here is the meal plan, largely carried over from previous years. Since we are quite a large group, we should again roughly plan in advance what we want to cook. We can buy some things in Germany and bring them along, but there is also a supermarket nearby. General items such as spices or spreads for breakfast rolls could also be brought from home.&lt;br /&gt;
&lt;br /&gt;
If you have cooking suggestions or other ideas, feel free to add and comment on them below. Keep in mind that preparation should be (relatively) simple. Since the list has worked well in recent years, we will keep it, but the order/days can certainly still be shuffled around. A few things will sometimes still have to be bought at the Spar, since the fridge only has limited space.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Meal plan:'''&lt;br /&gt;
*Saturday: spaghetti Bolognese + veg. Bolognese&lt;br /&gt;
*Sunday: Käsespätzle&lt;br /&gt;
*Monday: burritos&lt;br /&gt;
*Tuesday: risotto&lt;br /&gt;
*Wednesday: goulash&lt;br /&gt;
*Thursday: potatoes with herb quark&lt;br /&gt;
*Friday: dal with rice (or sweet potato curry)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Shopping list from 2022 (last year): https://docs.google.com/document/d/166b_mIXxmoAgcv1upIGremd_36E1fmwwxsQADP2uEv8/&lt;br /&gt;
&lt;br /&gt;
@Christian: please either bring a vegan option yourself or add it here (if it is easy to get)&lt;br /&gt;
&lt;br /&gt;
=== Drinks/Party ===&lt;br /&gt;
&lt;br /&gt;
=== Games ===&lt;br /&gt;
&lt;br /&gt;
Shared game nights are always fun, so if you own any board games, feel free to bring them along! To get an overview, please also list them here:&lt;br /&gt;
&lt;br /&gt;
''* List games here''&lt;br /&gt;
&lt;br /&gt;
=== Miscellaneous ===&lt;br /&gt;
&lt;br /&gt;
=== Alternative activities ===&lt;br /&gt;
  *  Sledding (http://www.montafon.at/de/urlaubswelten/echte_naturliebhaber/rodeln)&lt;br /&gt;
  *  Snowshoe hiking (http://www.montafon.at/schneeschuhwanderungen)&lt;br /&gt;
  *  Thermal spa (http://www.montafon.at/schwimmen), e.g. http://www.aqua-dome.at/de (about 130 km away!)&lt;br /&gt;
&lt;br /&gt;
[[Category:Internal]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Install_ISIS_on_Linux&amp;diff=2623</id>
		<title>Install ISIS on Linux</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Install_ISIS_on_Linux&amp;diff=2623"/>
		<updated>2022-07-21T08:30:27Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;These instructions were tested on Debian 10, Ubuntu 20.04, and openSUSE 15.3. They should help you install both the X-ray isisscripts and the &amp;quot;stellar_isisscripts&amp;quot;, with a focus on the latter. They also assume that you use the bash shell; using csh requires minor changes.&lt;br /&gt;
&lt;br /&gt;
If you prefer an easy way to run the X-ray isisscripts (without the &amp;quot;stellar_isisscripts&amp;quot;), you can use the singularity container:&lt;br /&gt;
&lt;br /&gt;
https://www.sternwarte.uni-erlangen.de/wiki/index.php/Isis_tutorial_installing&lt;br /&gt;
&lt;br /&gt;
== 1. Install basic dependencies and general things ==&lt;br /&gt;
&lt;br /&gt;
If you don't have root access, you might have to build some things from source, e.g. pgplot, cfitsio, or libpng. &lt;br /&gt;
However, it might be easier to rely on HEASOFT in this case.  &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sudo apt-get -y install libreadline-dev&lt;br /&gt;
sudo apt-get -y install libcurl4&lt;br /&gt;
sudo apt-get -y install libcurl4-gnutls-dev&lt;br /&gt;
sudo apt-get -y install libncurses5-dev&lt;br /&gt;
sudo apt-get -y install xorg-dev&lt;br /&gt;
sudo apt-get -y install gcc g++ gfortran&lt;br /&gt;
sudo apt-get -y install perl-modules-5*&lt;br /&gt;
sudo apt-get -y install python3-dev&lt;br /&gt;
sudo apt-get -y install fig2dev&lt;br /&gt;
sudo apt-get -y install libpng-dev &lt;br /&gt;
sudo apt-get -y install zlib1g-dev&lt;br /&gt;
sudo apt-get -y install libpcre3-dev&lt;br /&gt;
sudo apt-get -y install libonig-dev&lt;br /&gt;
sudo apt-get -y install pgplot5&lt;br /&gt;
sudo apt-get -y install wget&lt;br /&gt;
sudo apt-get -y install git&lt;br /&gt;
&amp;lt;/pre&amp;gt; &lt;br /&gt;
&lt;br /&gt;
You have to define some things in your ~/.bashrc:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
export CC=/usr/bin/gcc&lt;br /&gt;
export CXX=/usr/bin/g++&lt;br /&gt;
export FC=/usr/bin/gfortran&lt;br /&gt;
export PERL=/usr/bin/perl&lt;br /&gt;
export PYTHON=/usr/bin/python3&lt;br /&gt;
&amp;lt;/pre&amp;gt; &lt;br /&gt;
&lt;br /&gt;
Then do &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== 2. Install HEASOFT ==&lt;br /&gt;
&lt;br /&gt;
This is ''not'' required for the &amp;quot;stellar_isisscripts&amp;quot; and can be skipped if you only want those, unless you have no other way to install pgplot and cfitsio. HEASOFT seems to be required to run the X-ray isisscripts properly. To download HEASOFT, go to:&lt;br /&gt;
&lt;br /&gt;
https://heasarc.gsfc.nasa.gov/lheasoft/download.html&lt;br /&gt;
&lt;br /&gt;
unpack the download, and then:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd heasoft-6.29/BUILD_DIR&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Before I could run the configure script I had to make many files executable, &lt;br /&gt;
so it seems easiest to make all of them executable (this is not good practice):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
find .. -name '*' | xargs chmod +x&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Building HEASOFT will take a while (typically more than 30 minutes). If you don't want to install as root, set a different installation location by running ./configure with &amp;lt;tt&amp;gt;'''--prefix=/path/to/installation/folder/'''&amp;lt;/tt&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./configure&lt;br /&gt;
make&lt;br /&gt;
sudo make install&lt;br /&gt;
find .. -name 'headas-init.sh' | xargs chmod +x&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, you need to define $HEASOFT. For me, this meant editing ~/.bashrc to &lt;br /&gt;
include the two following lines. You need to adjust the path of course:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
export HEADAS=/home/matti/programs/heasoft-6.29/x86_64-pc-linux-gnu-libc2.31&lt;br /&gt;
. $HEADAS/headas-init.sh&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For csh, use instead:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
setenv HEADAS /home/volans/mdorsch/isis/heasoft/installation/x86_64-pc-linux-gnu-libc2.31&lt;br /&gt;
source $HEADAS/headas-init.csh&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, in the terminal:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
# or source ~/.cshrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== 3. Install slang / slang modules / isis / isisscripts ==&lt;br /&gt;
&lt;br /&gt;
First get the latest versions of slang / jed / slgsl / slxfig / slirp / isis / isisscripts. &lt;br /&gt;
A scripted installation of slang and isis is possible, following the MIT guide:&lt;br /&gt;
&lt;br /&gt;
https://space.mit.edu/cxc/isis/install.html#XI&lt;br /&gt;
&lt;br /&gt;
However, I recommend using the git repositories. Go to the Software directory in your terminal and clone the repositories:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
git clone https://github.com/houckj/isis&lt;br /&gt;
git clone git://git.jedsoft.org/git/slang.git&lt;br /&gt;
git clone git://git.jedsoft.org/git/slxfig.git&lt;br /&gt;
git clone git://git.jedsoft.org/git/slgsl.git&lt;br /&gt;
git clone git://git.jedsoft.org/git/jed.git&lt;br /&gt;
git clone git://git.jedsoft.org/git/slirp.git&lt;br /&gt;
git clone http://www.sternwarte.uni-erlangen.de/git.public/isisscripts &lt;br /&gt;
&amp;lt;/pre&amp;gt; &lt;br /&gt;
&lt;br /&gt;
The following will assume you want to use root access. If that is not the case, you can use the option&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
--prefix=/path/to/installation/folder/&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
with the ./configure commands. Installing all of them to the same &amp;quot;--prefix&amp;quot; seems to work. &lt;br /&gt;
&lt;br /&gt;
Install slang. Go to the slang directory.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sh ./configure&lt;br /&gt;
make&lt;br /&gt;
sudo make install&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Make sure that all necessary standard modules are installed, including &amp;quot;png&amp;quot;, which requires &amp;quot;libpng16&amp;quot; to be installed. &lt;br /&gt;
If &amp;quot;libpng16&amp;quot; is not installed yet, you can easily build it from source:&lt;br /&gt;
&lt;br /&gt;
https://sourceforge.net/projects/libpng/files/&lt;br /&gt;
&lt;br /&gt;
and then add its installation folder to $PATH.&lt;br /&gt;
&lt;br /&gt;
Install isis. Go to the isis directory, here with HEADAS:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sh ./configure --with-headas=$HEADAS&lt;br /&gt;
make&lt;br /&gt;
sudo make install&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
If your aim is to install the &amp;quot;stellar_isisscripts&amp;quot;, you don't need HEADAS, but may have to install fitsio:&lt;br /&gt;
&lt;br /&gt;
https://heasarc.gsfc.nasa.gov/fitsio/&lt;br /&gt;
&lt;br /&gt;
Then go to the isis directory and:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sh ./configure&lt;br /&gt;
make&lt;br /&gt;
sudo make install&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Add &amp;lt;tt&amp;gt;'''--with-slang=/path/to/slang'''&amp;lt;/tt&amp;gt; to the following configure commands in case you installed slang in a path other than &amp;lt;tt&amp;gt;/usr/local/&amp;lt;/tt&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Install jed. Go to the jed directory.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sh ./configure&lt;br /&gt;
make&lt;br /&gt;
make xjed&lt;br /&gt;
sudo make install&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Install slgsl. Go to the slgsl directory.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sh ./configure&lt;br /&gt;
make&lt;br /&gt;
sudo make install&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Install slxfig. Go to the slxfig directory.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sh ./configure&lt;br /&gt;
make&lt;br /&gt;
sudo make install&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Install slirp. Go to the slirp directory.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sh ./configure&lt;br /&gt;
make&lt;br /&gt;
sudo make install &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Before building the isisscripts, install the &amp;quot;File::Slurp&amp;quot; perl module. &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sudo perl -MCPAN -e shell&lt;br /&gt;
install CPAN&lt;br /&gt;
reload cpan&lt;br /&gt;
install File::Slurp&lt;br /&gt;
quit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then go to the isisscripts directory and&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
make&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Create / modify the [[ISIS auto start (.isisrc)|.isisrc]] file in your &amp;lt;tt&amp;gt;$HOME&amp;lt;/tt&amp;gt; directory and add the following lines&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
add_to_isis_load_path(&amp;quot;/Path/To/isisscripts/share/&amp;quot;);&lt;br /&gt;
define ris() { require(&amp;quot;isisscripts&amp;quot;); };&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
This gives easy access to &amp;lt;tt&amp;gt;require(&amp;quot;isisscripts&amp;quot;)&amp;lt;/tt&amp;gt; in isis.&lt;br /&gt;
&lt;br /&gt;
== 4. Install stellar_isisscripts ==&lt;br /&gt;
&lt;br /&gt;
Get the newest version of the stellar_isisscripts from git.&lt;br /&gt;
If you have a Remeis/gitlab account and are at the Remeis observatory, do:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
git clone git@serpens.sternwarte.uni-erlangen.de:irrgang/stellar.git&lt;br /&gt;
mv stellar stellar_isisscripts&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
If you have a Remeis/gitlab account and are not at the Remeis observatory, do:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
git clone http://www.sternwarte.uni-erlangen.de/gitlab/irrgang/stellar.git&lt;br /&gt;
mv stellar stellar_isisscripts&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Otherwise ask Matti Dorsch. Then:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd stellar_isisscripts&lt;br /&gt;
make &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now compile some necessary C functions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd stellar_isisscripts/slirp&lt;br /&gt;
slirp -make -lm -lgsl -lgslcblas -lpthread c_functions.h c_functions.o&lt;br /&gt;
sed -i -e 's/^CFLAGS[[:space:]]*= -g -O2/CFLAGS = -g -Ofast -Wall -Wextra/g' Makefile &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
make&lt;br /&gt;
make test&lt;br /&gt;
rm c_functions-test.sl ; rm c_functions.o ; rm c_functions_glue.o ; rm c_functions_glue.c ; rm Makefile ; rm *~ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Create / modify the [[ISIS auto start (.isisrc)|.isisrc]] file in your &amp;lt;tt&amp;gt;$HOME&amp;lt;/tt&amp;gt; directory and add the following lines&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
add_to_isis_load_path(&amp;quot;/Path/To/stellar_isisscripts/share/&amp;quot;);&lt;br /&gt;
define rmy() { require(&amp;quot;stellar_isisscripts&amp;quot;); };&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To use all functions, you need the stilts tool, which is related to TOPCAT. You can install it as a package:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sudo apt-get install stilts&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Only if this does not work:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mkdir stilts; cd stilts&lt;br /&gt;
wget http://www.star.bris.ac.uk/~mbt/stilts/stilts.jar&lt;br /&gt;
wget 'http://www.star.bris.ac.uk/~mbt/stilts/stilts'&lt;br /&gt;
chmod +x stilts&lt;br /&gt;
sudo cp * /usr/bin/&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Without root, you can add the folder that contains stilts to your $PATH in ~/.bashrc or ~/.cshrc.&lt;br /&gt;
&lt;br /&gt;
To test if the scripts work, try for example:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
isis&lt;br /&gt;
rmy;&lt;br /&gt;
s = query_photometry (&amp;quot;HZ44&amp;quot;);&lt;br /&gt;
s.print();&lt;br /&gt;
help query_photometry&lt;br /&gt;
q&lt;br /&gt;
exit;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Install_ISIS_on_Linux&amp;diff=2622</id>
		<title>Install ISIS on Linux</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Install_ISIS_on_Linux&amp;diff=2622"/>
		<updated>2022-07-20T19:52:25Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;These instructions were tested on Debian 10, Ubuntu 20.04, and openSUSE 15.3. They should help you install both the X-ray isisscripts and the &amp;quot;stellar_isisscripts&amp;quot;, with a focus on the latter. They also assume that you use the bash shell; using csh requires minor changes.&lt;br /&gt;
&lt;br /&gt;
If you prefer an easy way to run the X-ray isisscripts (without the &amp;quot;stellar_isisscripts&amp;quot;), you can use the singularity container:&lt;br /&gt;
&lt;br /&gt;
https://www.sternwarte.uni-erlangen.de/wiki/index.php/Isis_tutorial_installing&lt;br /&gt;
&lt;br /&gt;
== 1. Install basic dependencies and general things ==&lt;br /&gt;
&lt;br /&gt;
If you don't have root access, you might have to build some things from source, e.g. pgplot, cfitsio, or libpng. &lt;br /&gt;
However, it might be easier to rely on HEASOFT in this case.  &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sudo apt-get -y install libreadline-dev&lt;br /&gt;
sudo apt-get -y install libcurl4&lt;br /&gt;
sudo apt-get -y install libcurl4-gnutls-dev&lt;br /&gt;
sudo apt-get -y install libncurses5-dev&lt;br /&gt;
sudo apt-get -y install libgsl-dev   # at least needed for Ubuntu 22.04&lt;br /&gt;
sudo apt-get -y install xorg-dev&lt;br /&gt;
sudo apt-get -y install gcc g++ gfortran&lt;br /&gt;
sudo apt-get -y install perl-modules-5*&lt;br /&gt;
sudo apt-get -y install python3-dev&lt;br /&gt;
sudo apt-get -y install fig2dev&lt;br /&gt;
sudo apt-get -y install libpng-dev &lt;br /&gt;
sudo apt-get -y install zlib1g-dev&lt;br /&gt;
sudo apt-get -y install libpcre3-dev&lt;br /&gt;
sudo apt-get -y install libonig-dev&lt;br /&gt;
sudo apt-get -y install pgplot5&lt;br /&gt;
sudo apt-get -y install wget&lt;br /&gt;
sudo apt-get -y install git&lt;br /&gt;
&amp;lt;/pre&amp;gt; &lt;br /&gt;
&lt;br /&gt;
You have to define some things in your ~/.bashrc:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
export CC=/usr/bin/gcc&lt;br /&gt;
export CXX=/usr/bin/g++&lt;br /&gt;
export FC=/usr/bin/gfortran&lt;br /&gt;
export PERL=/usr/bin/perl&lt;br /&gt;
export PYTHON=/usr/bin/python3&lt;br /&gt;
&amp;lt;/pre&amp;gt; &lt;br /&gt;
&lt;br /&gt;
Then do &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== 2. Install HEASOFT ==&lt;br /&gt;
&lt;br /&gt;
This is ''not'' required for the &amp;quot;stellar_isisscripts&amp;quot; and can be skipped if you only want to install these. That is, unless you have no other way to install pgplot and cfitsio. HEASOFT seems to be required to run the X-ray isisscripts properly. To download HEASOFT, go to:&lt;br /&gt;
&lt;br /&gt;
https://heasarc.gsfc.nasa.gov/lheasoft/download.html&lt;br /&gt;
&lt;br /&gt;
unpack the download, and then:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd heasoft-6.29/BUILD_DIR&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Before I could run the configure script I had to make many files executable, &lt;br /&gt;
so it seems easiest to make all of them executable (this is not good practice):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
find .. -type f -print0 | xargs -0 chmod +x&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Building HEASOFT will take a while (30 minutes or more). If you don't want to use root, set a different installation location by using ./configure with &amp;lt;tt&amp;gt;'''--prefix=/path/to/installation/folder/'''&amp;lt;/tt&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./configure&lt;br /&gt;
make&lt;br /&gt;
sudo make install&lt;br /&gt;
find .. -name 'headas-init.sh' | xargs chmod +x&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Finally, you need to define $HEADAS. For me, this meant editing ~/.bashrc to &lt;br /&gt;
include the following two lines. You need to adjust the path, of course:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
export HEADAS=/home/matti/programs/heasoft-6.29/x86_64-pc-linux-gnu-libc2.31&lt;br /&gt;
. $HEADAS/headas-init.sh&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
for csh, use instead:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
setenv HEADAS /home/volans/mdorsch/isis/heasoft/installation/x86_64-pc-linux-gnu-libc2.31&lt;br /&gt;
source $HEADAS/headas-init.csh&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then, in the terminal:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
source ~/.bashrc&lt;br /&gt;
# or source ~/.cshrc&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
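&lt;br /&gt;
If everything worked, the HEASOFT tools should now be on your $PATH. A quick sanity check (assuming a standard HEASOFT installation) is:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
fversion&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This should print the installed HEASOFT release; if the command is not found, check the $HEADAS path in your shell rc file.&lt;br /&gt;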
&lt;br /&gt;
== 3. Install slang / slang modules / isis / isisscripts ==&lt;br /&gt;
&lt;br /&gt;
First get the latest versions of slang / jed / slgsl / slxfig / slirp / isis / isisscripts. &lt;br /&gt;
A scripted installation of slang and isis is possible, following the MIT guide:&lt;br /&gt;
&lt;br /&gt;
https://space.mit.edu/cxc/isis/install.html#XI&lt;br /&gt;
&lt;br /&gt;
However, I recommend using the git repositories. Go to the Software directory in your terminal and clone the repositories:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
git clone https://github.com/houckj/isis&lt;br /&gt;
git clone git://git.jedsoft.org/git/slang.git&lt;br /&gt;
git clone git://git.jedsoft.org/git/slxfig.git&lt;br /&gt;
git clone git://git.jedsoft.org/git/slgsl.git&lt;br /&gt;
git clone git://git.jedsoft.org/git/jed.git&lt;br /&gt;
git clone git://git.jedsoft.org/git/slirp.git&lt;br /&gt;
git clone http://www.sternwarte.uni-erlangen.de/git.public/isisscripts &lt;br /&gt;
&amp;lt;/pre&amp;gt; &lt;br /&gt;
&lt;br /&gt;
The following will assume you want to use root access. If that is not the case, you can use the option&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
--prefix=/path/to/installation/folder/&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
with the ./configure commands. Installing all of them to the same &amp;quot;--prefix&amp;quot; seems to work. &lt;br /&gt;
&lt;br /&gt;
Install slang. Go to the slang directory.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sh ./configure&lt;br /&gt;
make&lt;br /&gt;
sudo make install&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Make sure that all necessary standard modules are installed, including &amp;quot;png&amp;quot;, which requires &amp;quot;libpng16&amp;quot; to be installed. &lt;br /&gt;
If &amp;quot;libpng16&amp;quot; is not installed yet, you can easily build it from source:&lt;br /&gt;
&lt;br /&gt;
https://sourceforge.net/projects/libpng/files/&lt;br /&gt;
&lt;br /&gt;
and then add its installation folder to $PATH.&lt;br /&gt;
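&lt;br /&gt;
For example, if you built libpng with &amp;lt;tt&amp;gt;--prefix=$HOME/software/libpng&amp;lt;/tt&amp;gt; (the prefix here is just a placeholder, adjust it to your setup), the additions to ~/.bashrc could look like:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
export PATH=$HOME/software/libpng/bin:$PATH&lt;br /&gt;
export LD_LIBRARY_PATH=$HOME/software/libpng/lib:$LD_LIBRARY_PATH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Adding the lib directory to $LD_LIBRARY_PATH ensures that the slang png module can find the shared library at runtime.&lt;br /&gt;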
&lt;br /&gt;
Install isis. Go to the isis directory, here with HEADAS:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sh ./configure --with-headas=$HEADAS&lt;br /&gt;
make&lt;br /&gt;
sudo make install&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
If your aim is to install the &amp;quot;stellar_isisscripts&amp;quot;, you don't need HEADAS, but may have to install cfitsio:&lt;br /&gt;
&lt;br /&gt;
https://heasarc.gsfc.nasa.gov/fitsio/&lt;br /&gt;
&lt;br /&gt;
Then go to the isis directory and:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sh ./configure&lt;br /&gt;
make&lt;br /&gt;
sudo make install&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Add &amp;lt;tt&amp;gt;'''--with-slang=/path/to/slang'''&amp;lt;/tt&amp;gt; to the following configure calls in case you installed slang in a path other than &amp;lt;tt&amp;gt;/usr/local/&amp;lt;/tt&amp;gt;.&lt;br /&gt;
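&lt;br /&gt;
For example, a non-root build against a slang installed under a custom prefix might look like this (the paths are placeholders; the exact option name can differ between packages, so check &amp;lt;tt&amp;gt;./configure --help&amp;lt;/tt&amp;gt;):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sh ./configure --prefix=/path/to/installation/folder/ --with-slang=/path/to/slang&lt;br /&gt;
make&lt;br /&gt;
make install&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;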
&lt;br /&gt;
Install jed. Go to the jed directory&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sh ./configure&lt;br /&gt;
make&lt;br /&gt;
make xjed&lt;br /&gt;
sudo make install&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Install slgsl. Go to the slgsl directory&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sh ./configure&lt;br /&gt;
make&lt;br /&gt;
sudo make install&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Install slxfig. Go to the slxfig directory&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sh ./configure&lt;br /&gt;
make&lt;br /&gt;
sudo make install&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Install slirp. Go to the slirp directory&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sh ./configure&lt;br /&gt;
make&lt;br /&gt;
sudo make install &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Before building the isisscripts, install the &amp;quot;File::Slurp&amp;quot; perl module. &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sudo perl -MCPAN -e shell&lt;br /&gt;
install CPAN&lt;br /&gt;
reload cpan&lt;br /&gt;
install File::Slurp&lt;br /&gt;
quit&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then go to the isisscripts directory and&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
make&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Create / modify the [[ISIS auto start (.isisrc)|.isisrc]] file in your &amp;lt;tt&amp;gt;$HOME&amp;lt;/tt&amp;gt; directory and add the following lines&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
add_to_isis_load_path(&amp;quot;/Path/To/isisscripts/share/&amp;quot;);&lt;br /&gt;
define ris() { require(&amp;quot;isisscripts&amp;quot;); };&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
This allows easy access to &amp;lt;tt&amp;gt;require(&amp;quot;isisscripts&amp;quot;)&amp;lt;/tt&amp;gt; in isis.&lt;br /&gt;
&lt;br /&gt;
== 4. Install stellar_isisscripts ==&lt;br /&gt;
&lt;br /&gt;
Get the newest version of the stellar_isisscripts from git.&lt;br /&gt;
If you have a Remeis/gitlab account and are at the Remeis observatory, do:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
git clone git@serpens.sternwarte.uni-erlangen.de:irrgang/stellar.git&lt;br /&gt;
mv stellar stellar_isisscripts&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
If you have a Remeis/gitlab account and are not at the Remeis observatory, do:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
git clone http://www.sternwarte.uni-erlangen.de/gitlab/irrgang/stellar.git&lt;br /&gt;
mv stellar stellar_isisscripts&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Otherwise ask Matti Dorsch. Then:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd stellar_isisscripts&lt;br /&gt;
make &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now compile some necessary C functions:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd stellar_isisscripts/slirp&lt;br /&gt;
slirp -make -lm -lgsl -lgslcblas -lpthread c_functions.h c_functions.o&lt;br /&gt;
sed -i -e 's/^CFLAGS[[:space:]]*= -g -O2/CFLAGS = -g -Ofast -Wall -Wextra/g' Makefile &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
make&lt;br /&gt;
make test&lt;br /&gt;
rm c_functions-test.sl ; rm c_functions.o ; rm c_functions_glue.o ; rm c_functions_glue.c ; rm Makefile ; rm *~ &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Create / modify the [[ISIS auto start (.isisrc)|.isisrc]] file in your &amp;lt;tt&amp;gt;$HOME&amp;lt;/tt&amp;gt; directory and add the following lines&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
add_to_isis_load_path(&amp;quot;/Path/To/stellar_isisscripts/share/&amp;quot;);&lt;br /&gt;
define rmy() { require(&amp;quot;stellar_isisscripts&amp;quot;); };&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To use all functions, you need the stilts tool, which is related to TOPCAT. You can install it as a package:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sudo apt-get install stilts&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Only if this does not work:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mkdir stilts; cd stilts&lt;br /&gt;
wget http://www.star.bris.ac.uk/~mbt/stilts/stilts.jar&lt;br /&gt;
wget 'http://www.star.bris.ac.uk/~mbt/stilts/stilts'&lt;br /&gt;
chmod +x stilts&lt;br /&gt;
sudo cp * /usr/bin/&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Without root, you can add the folder that contains stilts to your $PATH in ~/.bashrc or ~/.cshrc.&lt;br /&gt;
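&lt;br /&gt;
For example, assuming you placed the two stilts files in ~/software/stilts (a placeholder location), the addition would be:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# ~/.bashrc&lt;br /&gt;
export PATH=$HOME/software/stilts:$PATH&lt;br /&gt;
# ~/.cshrc&lt;br /&gt;
setenv PATH ${HOME}/software/stilts:${PATH}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;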
&lt;br /&gt;
To test if the scripts work, try for example:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
isis&lt;br /&gt;
rmy;&lt;br /&gt;
s = query_photometry (&amp;quot;HZ44&amp;quot;);&lt;br /&gt;
s.print();&lt;br /&gt;
help query_photometry&lt;br /&gt;
q&lt;br /&gt;
exit;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2509</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2509"/>
		<updated>2022-05-19T09:38:11Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a Python package called ''fermipy'' that lets you do the data analysis from within a Python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. They are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options: activating an existing conda environment, or creating the corresponding conda environment yourself. The first is currently the recommended choice, in part because the pipeline scripts are written for Python 2.7; for the second, you need to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with this version of the conda environment. In order to invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition to your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub and the full code, documentation and installation can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing the right version of each (in between versions there was a change from python2.7 to python3). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT, as well as information about the status of the spacecraft itself at each time. This information is needed for an analysis, as it is the main input. These files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, hence, it is not necessary to download a specific dataset. If you are using the provided analysis scripts, they automatically search all the data available at the observatory.&lt;br /&gt;
All photon files are linked at '''/home/X-ray/Fermi/archive/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/home/X-ray/Fermi/archive/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the Galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /home/X-ray/Fermi/archive/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/home/X-ray/Fermi/archive/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/home/X-ray/Fermi/archive/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis on a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and both a start time and an end time, for which an average gamma-ray spectrum should be computed. There are several options you can choose from regarding the specifics of the analysis, which you can call via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you do the analysis on a known gamma-ray source, which is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to add the coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
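&lt;br /&gt;
For example (source name, coordinates, and MJD range below are purely illustrative), a call for a 4FGL source and one for an unknown source could look like:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# source listed in the 4FGL: the name is enough&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl 4FGLJ1104.4+3812 58000 58365&lt;br /&gt;
# source not in the 4FGL: state this and give coordinates&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --known=no --ra=166.11 --dec=38.21 MySource 58000 58365&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;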
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The computation of a light curve is similarly called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and both a start time and an end time, for which the light curve should be computed. Again, several options are provided for specific analysis needs, which you can call via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
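&lt;br /&gt;
As an illustration (source name and MJD range are placeholders), a daily-binned light curve would be requested with:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --timebin=1 4FGLJ1104.4+3812 59000 59100&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;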
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take several hours up to days (if you want to compute many bins for your light curve). Ideally you want to let the cluster perform your jobs for you and not run an analysis on your desktop machine, unless you want to test something. In order to make life easy for a job submission on slurm, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' should look like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the name and the time ranges for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''', gives a full overview of all options available here:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known = yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, give only the number, in units of gigabytes; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailaddress	If you want to be informed about the slurm status per email, put your email address here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. Especially the memory and walltime options need to be increased in case you want to compute a lightcurve with more than 50 bins.&lt;br /&gt;
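&lt;br /&gt;
For example, a multi-year light curve with an increased walltime and memory allocation could be submitted like this (all values are illustrative):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl --walltime=48 --mem=6 --email=my@email.de LC 4FGLJ1104.4+3812 55000 59000&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;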
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this is not possible to use via slurm at the moment. If you're running an analysis on your machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information on this, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''How can I submit an array job?''' - In order to do that, you need to work from within a bash shell, not a (t)csh shell. I recommend writing a script to create the .slurm file that you want to submit to slurm. Here's an example of what this array job can look like:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --job-name LATspectra&lt;br /&gt;
#SBATCH --output job.out-%a&lt;br /&gt;
#SBATCH --error err.out-%a&lt;br /&gt;
#SBATCH --time 48:00:00&lt;br /&gt;
#SBATCH --partition=remeis&lt;br /&gt;
#SBATCH --ntasks=1&lt;br /&gt;
#SBATCH --hint=nomultithread&lt;br /&gt;
#SBATCH --mem-per-cpu=4G&lt;br /&gt;
#SBATCH --mail-type=END,FAIL,ARRAY_TASKS&lt;br /&gt;
#SBATCH --mail-user=my@email.de&lt;br /&gt;
# XXX: number of tasks in array, YY: number of jobs running simultaneously&lt;br /&gt;
#SBATCH --array 0-XXX%YY&lt;br /&gt;
&lt;br /&gt;
COMMAND[0]=&amp;quot;/software/Science/satscripts/fermiscripts/make_spectrum.sl --expath=/your_path/ 4FGLJXXX.X-XXXX mjd_start mjd_stop&amp;quot;&lt;br /&gt;
COMMAND[1]=&amp;quot;/software/Science/satscripts/fermiscripts/make_spectrum.sl --expath=/your_path/ 4FGLJYYY.Y-YYYY mjd_start mjd_stop&amp;quot;&lt;br /&gt;
COMMAND[2]=&amp;quot;...&amp;quot;&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
# run the command corresponding to this array task&lt;br /&gt;
eval &amp;quot;${COMMAND[$SLURM_ARRAY_TASK_ID]}&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
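&lt;br /&gt;
Assuming you saved the script above as, say, spectra_array.slurm (the filename is arbitrary), it is submitted and monitored with the standard slurm commands:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sbatch spectra_array.slurm&lt;br /&gt;
squeue -u $USER   # check the status of your array tasks&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;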
&lt;br /&gt;
'''Can the scripts find possibly unknown gamma-ray sources on their own?''' - Yes, when running the analysis script to create a spectrum (with make_spectrum.sl), there is a routine implemented (fermipy function ''find_sources'') which searches for excesses in the TS map with a significance of 5 sigma or higher. The coordinates of any source found this way are then saved in the folder in which the analysis is run, to allow further inspection and a follow-up analysis.&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
During the Fermi data analysis, several errors can pop up and are not necessarily linked to the Fermi analysis itself. Here, I'll show those I encountered and what I have done to (hot)fix them.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
The internal sub-routine for creating the LCs threw an error a couple of times, saying that one of the LC bin folders was missing and, hence, the light curve could not be assembled into one fits file. Because I couldn't pinpoint the error in the source code, my work-around for this issue is implemented in the wrapper scripts ''make_spectrum.sl'' and ''make_lightcurve.sl''. Deleting the subfolders is now handled there and not via the routine in the source code. &lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Individual bins crashed and then the light curve cannot be assembled''' &amp;lt;br/&amp;gt;&lt;br /&gt;
This is a problem that I fixed manually by editing the source code in the lightcurve.py file. While I have also [https://github.com/fermiPy/fermipy/issues/405 reported this issue to the developers], it has not been fixed yet (as of June 23, 2021). In order to still manage to create the light curve, I implemented the following hot fix. In case you are setting up your own conda environment and you come across an error related to ''KeyError: fit_success'', here is my hot fix (additions marked with &amp;amp;rArr;):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# this snippet is nearly at the bottom of the file&lt;br /&gt;
&lt;br /&gt;
        itimes = enumerate(zip(times[:-1], times[1:]))&lt;br /&gt;
          &amp;amp;rArr; flux_const_for_calcTS_var = None&lt;br /&gt;
        for i, time in itimes:&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if not 'fit_success' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; self.logger.error(&lt;br /&gt;
                  &amp;amp;rArr; 'fit_success not found in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                  &amp;amp;rArr; continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; [el]if not mapo[i]['fit_success']:&lt;br /&gt;
                self.logger.error(&lt;br /&gt;
                    'Fit failed in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if 'flux_const' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; flux_const_for_calcTS_var = mapo[i]['flux_const']&lt;br /&gt;
&lt;br /&gt;
            for k in o.keys():&lt;br /&gt;
&lt;br /&gt;
                if k == 'config':&lt;br /&gt;
                    continue&lt;br /&gt;
                if not k in mapo[i]:&lt;br /&gt;
                    continue&lt;br /&gt;
                # if (isinstance(o[k], np.ndarray) and&lt;br /&gt;
                #    o[k][i].shape != mapo[i][k].shape):&lt;br /&gt;
                #    gta.logger.warning('Incompatible shape for column %s', k)&lt;br /&gt;
                #    continue&lt;br /&gt;
&lt;br /&gt;
                try:&lt;br /&gt;
                    o[k][i] = mapo[i][k]&lt;br /&gt;
                except:&lt;br /&gt;
                    pass&lt;br /&gt;
&lt;br /&gt;
        systematic = kwargs.get('systematic', 0.02)&lt;br /&gt;
&lt;br /&gt;
        o['ts_var'] = calcTS_var(loglike=o['loglike_fixed'],&lt;br /&gt;
                                 loglike_const=o['loglike_const'],&lt;br /&gt;
                                 flux_err=o['flux_err_fixed'],&lt;br /&gt;
                                   &amp;amp;rArr; flux_const=flux_const_for_calcTS_var,&lt;br /&gt;
                                 systematic=systematic,&lt;br /&gt;
                                 fit_success=o['fit_success_fixed'])&lt;br /&gt;
&lt;br /&gt;
        return o&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
While /karl has the benefit of storing large amounts of data, there is an issue with computing many Fermi spectra or light curves at the same time while storing the output on /karl. The problems that arise because of this are not clearly visible from checking the log files. A related error can, for example, look like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GTBinnedAnalysis.run_gtapp(): Caught N3tip12TipExceptionE at the top level: setNumRecords could not insert rows in FITS table in extension &amp;quot;EVENTS&amp;quot; in file &amp;quot;/path_to_analysis/LC_analysis/ft1_00.fits&amp;quot; (CFITSIO ERROR 108: error reading from FITS file)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In case you need to compute a lot of spectra or light curves and want to do this on /karl, either limit the number of jobs running simultaneously to 10, or store the data on /userdata, where I have so far never encountered this problem.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
If you encounter a new error, you can also go ahead and start the python script for the analysis directly in the shell. Remember to load your conda environment first. You also need a ''config_SED.yaml'' or ''config_LC.yaml'' file in the folder where you are going to run the analysis, otherwise you won't get far. You can directly run the script, e.g. via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(fermipy)[user@computer]&amp;gt; $FERMITOOLS/SEDcreator_fermipy.py&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve fails due to too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
Computing a long light curve with a short binning (e.g. daily binning for a year-long light curve, or weekly binning for a light curve covering a few years) can result in a termination of the analysis process once the computer cannot handle the number of open files anymore. I'm not entirely sure what goes wrong internally; my suggested hot fix is to split the computation of the light curve into manageable time intervals (e.g. chunks of ~100 bins). After all chunks have been computed, you can put them together using the command ''ftmerge'':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
&lt;br /&gt;
# Step 1: Create list of all light curve chunk files&lt;br /&gt;
ls -1 /path_to_my_analysis_results/source_name/LC_analysis/*_lightcurve.fits &amp;gt; /path_to_my_analysis_results/source_name.list&lt;br /&gt;
&lt;br /&gt;
# /path_to_my_analysis_results/ should contain the folders of each chunk, e.g. named MJD_start_end&lt;br /&gt;
&lt;br /&gt;
# Step 2: Merge all light curve files into one fits file&lt;br /&gt;
ftmerge @/path_to_my_analysis_results/source_name.list /path_to_my_analysis_results/lightcurve_source_name.fits clobber=yes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Don't worry about losing already-computed bins in case your analysis crashes: those are not deleted when the script crashes mid-way, so the finished LC bins are still available after the crash and it's not necessary to re-do the analysis for all of the light curve bins.&lt;br /&gt;
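If you want to script the chunking itself, the chunk boundaries can be generated with a few lines of python. This is only a sketch: the function name is mine, and the chunk size of 100 bins simply matches the suggestion above.&lt;br /&gt;

```python
# Split a long MJD range into chunks of at most `max_bins` light curve
# bins each, so that a single fermipy run stays below the open-file limit.
# Integer MJDs are assumed; the last chunk simply ends at tmax.

def lc_chunks(tmin, tmax, timebin=7, max_bins=100):
    step = timebin * max_bins
    edges = list(range(tmin, tmax, step)) + [tmax]
    return list(zip(edges[:-1], edges[1:]))

# Example: weekly bins over roughly four years, in pieces of at most 100 bins
for start, end in lc_chunks(55000, 56500):
    print("MJD_%d_%d" % (start, end))
```

Each (start, end) pair can then be passed to ''make_lightcurve.sl'' as tmin and tmax, with the chunk folders named following the MJD_start_end convention mentioned above.&lt;br /&gt;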
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: September 27, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2508</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2508"/>
		<updated>2022-05-19T09:37:13Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the script, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is an extensive documentation available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, please find the information on the Fermi contact person at the Remeis observatory at the bottom of the page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package called ''fermipy'', which allows one to do the data analysis from within a python environment. There are analysis scripts, written with ''fermipy'', available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options to do so: using the already existing conda environment, or creating a corresponding conda environment yourself. The first option is currently the recommended choice, not least because the pipeline scripts are written for python 2.7; for the second option, one needs to make sure that the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the versions in this conda environment. In order to invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you only create an alias instead of sourcing this conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition to your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now switch into the conda environment in which the necessary tools for the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub; the full code, documentation, and installation instructions can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing matching versions of each (the switch from python 2.7 to python 3 happened between versions). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT as well as about the status of the spacecraft itself at any given time. This information is the main input for any analysis; the corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, hence it is not necessary to download a specific dataset. If you are using the provided analysis scripts, they automatically look through all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/home/X-ray/Fermi/archive/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/home/X-ray/Fermi/archive/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /home/X-ray/Fermi/archive/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/home/X-ray/Fermi/archive/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/home/X-ray/Fermi/archive/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis, as the background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis on a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and both a start time and an end time, for which an average gamma-ray spectrum should be computed. There are several options you can choose from regarding the specifics of the analysis, which you can call via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you do the analysis on a known gamma-ray source, which is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to add the coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
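A side note on the time arguments: the scripts take tmin and tmax in MJD, while Fermi data products internally use Mission Elapsed Time (MET, seconds since 2001-01-01, i.e. MJD 51910). If you ever need to translate between the two by hand, e.g. when reading tmin/tmax from a file header, a minimal sketch (ignoring leap seconds, so possibly off by a few seconds) is:&lt;br /&gt;

```python
# Approximate conversion between MJD and Fermi Mission Elapsed Time (MET).
# MET 0 corresponds to 2001-01-01 00:00:00 (MJD 51910); leap seconds are
# ignored here, which is fine when selecting day- or week-scale time ranges.

FERMI_MET_REF_MJD = 51910.0
SECONDS_PER_DAY = 86400.0

def mjd_to_met(mjd):
    return (mjd - FERMI_MET_REF_MJD) * SECONDS_PER_DAY

def met_to_mjd(met):
    return FERMI_MET_REF_MJD + met / SECONDS_PER_DAY
```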
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The computation of a light curve is similarly called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and both a start time and an end time, for which the light curve should be computed. Again, several options are provided for specific analysis needs, which you can list via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
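Before submitting a light curve job, it can help to estimate how many bins the chosen time range and binning will produce, since that number drives the runtime and the required walltime. A quick estimate in python (the helper name is mine):&lt;br /&gt;

```python
import math

def n_lc_bins(tmin_mjd, tmax_mjd, timebin_days=7):
    """Number of light curve bins for the given MJD range and binning."""
    return math.ceil((tmax_mjd - tmin_mjd) / timebin_days)

# Example: one year of weekly bins
print(n_lc_bins(58000, 58365))   # 53 bins
```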
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take from several hours up to days (if you want to compute many bins for your light curve). Ideally you want to let the cluster perform your jobs for you and not run an analysis on your desktop machine, unless you want to test something. To make a job submission to slurm easy, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' looks like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the name and the time ranges for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''' gives a full overview of all available options:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known = yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, set only number in unit of Gigabyte; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailadress	If you want to get informed about the slurm status per email, put email here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. Especially the memory and walltime options need to be increased in case you want to compute a light curve with more than 50 bins.&lt;br /&gt;
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this is not possible to use via slurm at the moment. If you're running an analysis on your machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information on this, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''How can I submit an array job?''' - In order to do that, you need to work from within a bash shell, not a (t)csh shell. I recommend writing a script that creates the .slurm file you want to submit. Here's an example of what such an array job can look like:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --job-name LATspectra&lt;br /&gt;
#SBATCH --output job.out-%a&lt;br /&gt;
#SBATCH --error err.out-%a&lt;br /&gt;
#SBATCH --time 48:00:00&lt;br /&gt;
#SBATCH --partition=remeis&lt;br /&gt;
#SBATCH --ntasks=1&lt;br /&gt;
#SBATCH --hint=nomultithread&lt;br /&gt;
#SBATCH --mem-per-cpu=4G&lt;br /&gt;
#SBATCH --mail-type=END,FAIL,ARRAY_TASKS&lt;br /&gt;
#SBATCH --mail-user=my@email.de&lt;br /&gt;
# XXX: number of tasks in the array, YY: number of jobs running simultaneously&lt;br /&gt;
#SBATCH --array 0-XXX%YY&lt;br /&gt;
&lt;br /&gt;
COMMAND[0]=&amp;quot;/software/Science/satscripts/fermiscripts/make_spectrum.sl --expath=/your_path/ 4FGLJXXX.X-XXXX mjd_start mjd_stop&amp;quot;&lt;br /&gt;
COMMAND[1]=&amp;quot;/software/Science/satscripts/fermiscripts/make_spectrum.sl --expath=/your_path/ 4FGLJYYY.Y-YYYY mjd_start mjd_stop&amp;quot;&lt;br /&gt;
COMMAND[2]=&amp;quot;...&amp;quot;&lt;br /&gt;
...&lt;br /&gt;
&lt;br /&gt;
# Run the command that matches this task's array index&lt;br /&gt;
eval &amp;quot;${COMMAND[$SLURM_ARRAY_TASK_ID]}&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
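Such a .slurm file can also be generated programmatically. The python sketch below builds the job text for a list of sources; the source names, MJD range, and output path are hypothetical placeholders, and the final eval line is what actually dispatches each array task:&lt;br /&gt;

```python
# Sketch: build an array-job file for a list of sources. All names, times,
# and paths below are placeholders to adapt to your own analysis.

def make_array_job(sources, tmin, tmax, expath="/your_path/"):
    script = "/software/Science/satscripts/fermiscripts/make_spectrum.sl"
    lines = [
        "#!/bin/bash",
        "#SBATCH --job-name LATspectra",
        "#SBATCH --partition=remeis",
        "#SBATCH --ntasks=1",
        "#SBATCH --mem-per-cpu=4G",
        # at most 10 array tasks run at the same time
        "#SBATCH --array=0-%d%%10" % (len(sources) - 1),
    ]
    for i, src in enumerate(sources):
        lines.append('COMMAND[%d]="%s --expath=%s %s %d %d"'
                     % (i, script, expath, src, tmin, tmax))
    # dispatch the command matching this task's array index
    lines.append('eval "${COMMAND[$SLURM_ARRAY_TASK_ID]}"')
    return "\n".join(lines) + "\n"

print(make_array_job(["4FGLJ0001.1-0001", "4FGLJ0002.2-0002"], 58000, 58100))
```

Writing the returned text to a file and running ''sbatch'' on it submits the whole array in one go.&lt;br /&gt;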
&lt;br /&gt;
'''Can the scripts find previously unknown gamma-ray sources by themselves?''' - Yes, when running the analysis script to create a spectrum (with make_spectrum.sl), a routine (the fermipy function ''find_sources'') searches for excesses in the TS map with a significance of 5 sigma or higher. The coordinates of any source found this way are then saved in the folder in which the analysis is run, to allow further inspection and a follow-up analysis.&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
During the Fermi data analysis, several errors can pop up, which are not necessarily linked to the Fermi analysis itself. Here, I'll show those I encountered and what I have done to (hot)fix them.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
The internal sub-routine for creating the LCs threw an error a couple of times, stating that one of the LC bin folders was missing and that, hence, the light curve could not be assembled into one fits file. Because I couldn't pinpoint the error in the source code, my work-around for this issue is implemented in the wrapper scripts ''make_spectrum.sl'' and ''make_lightcurve.sl''. Deleting the subfolders is now handled there and not via the routine in the source code. &lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Individual bins crashed and then the light curve cannot be assembled''' &amp;lt;br/&amp;gt;&lt;br /&gt;
This is a problem that I fixed manually by editing the source code in the lightcurve.py file. While I have also [https://github.com/fermiPy/fermipy/issues/405 reported this issue to the developers], it has not been fixed yet (as of June 23, 2021). In order to still be able to create the light curve, I implemented the following hot fix. In case you are setting up your own conda environment and you come across an error related to ''KeyError: fit_success'', here is my hot fix (added lines are marked with &amp;amp;rArr;; the ''[el]'' prefix means that the existing ''if'' becomes an ''elif''):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# this snippet is nearly at the bottom of the file&lt;br /&gt;
&lt;br /&gt;
        itimes = enumerate(zip(times[:-1], times[1:]))&lt;br /&gt;
          &amp;amp;rArr; flux_const_for_calcTS_var = None&lt;br /&gt;
        for i, time in itimes:&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if not 'fit_success' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; self.logger.error(&lt;br /&gt;
                  &amp;amp;rArr; 'fit_success not found in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                  &amp;amp;rArr; continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; [el]if not mapo[i]['fit_success']:&lt;br /&gt;
                self.logger.error(&lt;br /&gt;
                    'Fit failed in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if 'flux_const' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; flux_const_for_calcTS_var = mapo[i]['flux_const']&lt;br /&gt;
&lt;br /&gt;
            for k in o.keys():&lt;br /&gt;
&lt;br /&gt;
                if k == 'config':&lt;br /&gt;
                    continue&lt;br /&gt;
                if not k in mapo[i]:&lt;br /&gt;
                    continue&lt;br /&gt;
                # if (isinstance(o[k], np.ndarray) and&lt;br /&gt;
                #    o[k][i].shape != mapo[i][k].shape):&lt;br /&gt;
                #    gta.logger.warning('Incompatible shape for column %s', k)&lt;br /&gt;
                #    continue&lt;br /&gt;
&lt;br /&gt;
                try:&lt;br /&gt;
                    o[k][i] = mapo[i][k]&lt;br /&gt;
                except:&lt;br /&gt;
                    pass&lt;br /&gt;
&lt;br /&gt;
        systematic = kwargs.get('systematic', 0.02)&lt;br /&gt;
&lt;br /&gt;
        o['ts_var'] = calcTS_var(loglike=o['loglike_fixed'],&lt;br /&gt;
                                 loglike_const=o['loglike_const'],&lt;br /&gt;
                                 flux_err=o['flux_err_fixed'],&lt;br /&gt;
                                   &amp;amp;rArr; flux_const=flux_const_for_calcTS_var,&lt;br /&gt;
                                 systematic=systematic,&lt;br /&gt;
                                 fit_success=o['fit_success_fixed'])&lt;br /&gt;
&lt;br /&gt;
        return o&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
While /karl has the benefit of storing large amounts of data, there is an issue with computing many Fermi spectra or light curves at the same time while storing the output on /karl. The problems that arise because of this are not clearly visible from checking the log files. A related error can, for example, look like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GTBinnedAnalysis.run_gtapp(): Caught N3tip12TipExceptionE at the top level: setNumRecords could not insert rows in FITS table in extension &amp;quot;EVENTS&amp;quot; in file &amp;quot;/path_to_analysis/LC_analysis/ft1_00.fits&amp;quot; (CFITSIO ERROR 108: error reading from FITS file)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In case you need to compute a lot of spectra or light curves and want to do this on /karl, either limit the number of jobs running simultaneously to 10, or store the data on /userdata, where I have so far never encountered this problem.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
If you encounter a new error, you can also go ahead and start the python script for the analysis directly in the shell. Remember to load your conda environment first. You also need a ''config_SED.yaml'' or ''config_LC.yaml'' file in the folder where you are going to run the analysis, otherwise you won't get far. You can directly run the script, e.g. via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(fermipy)[user@computer]&amp;gt; $FERMITOOLS/SEDcreator_fermipy.py&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve fails with too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
Computing a long light curve with short binning (e.g. daily binning for a year-long light curve, or weekly binning for a light curve covering a few years) can terminate the analysis process when the computer cannot handle the number of open files anymore. I'm not entirely sure what goes wrong internally; my suggested hot fix is to split the computation of the light curve into manageable time intervals (e.g. chunks of ~100 bins). After all chunks have been computed, you can put them together using the command ''ftmerge'':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
&lt;br /&gt;
# Step 1: Create list of all light curve chunk files&lt;br /&gt;
ls -1 /path_to_my_analysis_results/source_name/LC_analysis/*_lightcurve.fits &amp;gt; /path_to_my_analysis_results/source_name.list&lt;br /&gt;
&lt;br /&gt;
# /path_to_my_analysis_results/ should contain the folders of each chunk, e.g. named MJD_start_end&lt;br /&gt;
&lt;br /&gt;
# Step 2: Merge all light curve files into one fits file&lt;br /&gt;
ftmerge @/path_to_my_analysis_results/source_name.list /path_to_my_analysis_results/lightcurve_source_name.fits clobber = yes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
No worries about losing already computed bins in case your analysis crashes: those are not deleted when the script crashes mid-way, so the already computed LC bins are still available and it is not necessary to redo the analysis for all of the light curve bins.&lt;br /&gt;
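The chunking itself can also be scripted. The following bash sketch (paths, source name, and MJD values are all placeholders) prints one ''make_lightcurve.sl'' call per chunk of 700 days, i.e. 100 weekly bins:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
# split [TMIN, TMAX] into chunks of CHUNK days (placeholder values)&lt;br /&gt;
TMIN=58000   # start of the light curve [MJD]&lt;br /&gt;
TMAX=59400   # end of the light curve [MJD]&lt;br /&gt;
CHUNK=700    # chunk length in days (100 weekly bins)&lt;br /&gt;
start=$TMIN&lt;br /&gt;
while [ $start -lt $TMAX ]; do&lt;br /&gt;
    end=$(( start + CHUNK ))&lt;br /&gt;
    if [ $end -gt $TMAX ]; then end=$TMAX; fi&lt;br /&gt;
    echo &amp;quot;$FERMITOOLS/make_lightcurve.sl --expath=/path_to_my_analysis_results/source_name/MJD_${start}_${end}/ SOURCENAME $start $end&amp;quot;&lt;br /&gt;
    start=$end&lt;br /&gt;
done&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;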
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: September 27, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2241</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2241"/>
		<updated>2021-09-27T16:52:04Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the script, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is extensive documentation available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, please find the information on the Fermi contact person at the Remeis observatory at the bottom of the page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a Python package called ''fermipy'', which allows one to do the data analysis from within a Python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options to do so: using an already existing conda environment, or creating the corresponding conda environment yourself. The first option is currently the recommended choice, also because the pipeline scripts are written for Python 2.7; for the latter option, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with this conda environment. In order to invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition to your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
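For example, a shell session using this alias could look like the following (the prompt and the ''which'' output are illustrative):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
[user@computer]&amp;gt; fermipy&lt;br /&gt;
(fermipy)[user@computer]&amp;gt; which python&lt;br /&gt;
/userdata/data/gokus/conda/miniconda2/envs/fermipy/bin/python&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;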
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub and the full code, documentation and installation can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you install the right version of each (the switch from Python 2.7 to Python 3 happened between versions). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT, as well as information about the status of the spacecraft itself at each time. This information is the main input for an analysis. These files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, hence it is not necessary to download a specific dataset. The provided analysis scripts automatically look through all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis of a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and both a start time and an end time, for which an average gamma-ray spectrum should be computed. There are several options you can choose from regarding the specifics of the analysis, which you can list via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you do the analysis on a known gamma-ray source, which is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to add the coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
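A hypothetical example call, with a placeholder 4FGL name, a placeholder MJD range, and a placeholder output path, could look like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# compute the average spectrum between MJD 58000 and 58100 (all values are placeholders)&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --expath=/userdata/data/myuser/fermi/ 4FGLJXXXX.X+XXXX 58000 58100&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;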
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The computation of a light curve is done similarly; the script is called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and both a start time and an end time, for which the light curve should be computed. Again, several options are provided for specific analysis needs, which you can list via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
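As a concrete (but hypothetical) example, a daily-binned light curve for a known 4FGL source would be requested via:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# daily binning instead of the default weekly binning (source name and times are placeholders)&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --timebin=1 4FGLJXXXX.X+XXXX 58000 58365&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;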
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take several hours up to days (if you want to compute many bins for your light curve). Ideally, you let the cluster perform your jobs for you instead of running an analysis on your desktop machine, unless you want to test something. To make job submission via slurm easy, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' should look like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the name and the time ranges for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''' gives a full overview of all options available here:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known = yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, set only number in unit of Gigabyte; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailadress	If you want to get informed about the slurm status per email, put email here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. Especially the memory and walltime options need to be increased in case you want to compute a light curve with more than 50 bins.&lt;br /&gt;
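For example, a multi-year light curve could be submitted with increased walltime and memory like this (source name, times, and email are placeholders):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl --walltime=48 --mem=6 --email=my@email.de LC 4FGLJXXXX.X+XXXX 56000 59000&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;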
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this is currently not possible via slurm. If you're running an analysis on your own machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''How can I submit an array job?''' - In order to do that, you need to work from within a bash shell, not a (t)csh shell. I recommend writing a script that creates the .slurm file that you want to submit. Here is an example of what such an array job can look like:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --job-name LATspectra&lt;br /&gt;
#SBATCH --output job.out-%a&lt;br /&gt;
#SBATCH --error err.out-%a&lt;br /&gt;
#SBATCH --time 48:00:00&lt;br /&gt;
#SBATCH --partition=remeis&lt;br /&gt;
#SBATCH --ntasks=1&lt;br /&gt;
#SBATCH --hint=nomultithread&lt;br /&gt;
#SBATCH --mem-per-cpu=4G&lt;br /&gt;
#SBATCH --mail-type=END,FAIL,ARRAY_TASKS&lt;br /&gt;
#SBATCH --mail-user=my@email.de&lt;br /&gt;
# XXX: number of tasks in the array, YY: maximum number of jobs running simultaneously&lt;br /&gt;
#SBATCH --array 0-XXX%YY&lt;br /&gt;
&lt;br /&gt;
COMMAND[0]=&amp;quot;/software/Science/satscripts/fermiscripts/make_spectrum.sl --expath=/your_path/ 4FGLJXXX.X-XXXX mjd_start mjd_stop&amp;quot;&lt;br /&gt;
COMMAND[1]=&amp;quot;/software/Science/satscripts/fermiscripts/make_spectrum.sl --expath=/your_path/ 4FGLJYYY.Y-YYYY mjd_start mjd_stop&amp;quot;&lt;br /&gt;
COMMAND[2]=&amp;quot;...&amp;quot;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
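Assuming the file above was saved as, e.g., ''my_array_job.slurm'' (the name is arbitrary), it is submitted like any other slurm job:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sbatch my_array_job.slurm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;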
&lt;br /&gt;
'''Can the scripts find possibly unknown gamma-ray sources by themselves?''' - Yes, when running the analysis script to create a spectrum (with make_spectrum.sl), a routine is applied (the fermipy function ''find_sources''), which searches for excesses in the TS map with a significance of 5 sigma or higher. The coordinates of any source found this way are then saved in the folder in which the analysis is run, to allow further inspection and a follow-up analysis.&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
During the Fermi data analysis, several errors can pop up and are not necessarily linked to the Fermi analysis itself. Here, I'll show those I encountered and what I have done to (hot)fix them.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
The internal sub-routine for creating the LCs threw an error a couple of times, stating that one of the LC bin folders was missing and that, hence, the light curve could not be assembled into one FITS file. Because I couldn't pinpoint the error in the source code, my work-around for this issue is implemented in the wrapper scripts ''make_spectrum.sl'' and ''make_lightcurve.sl''. Deleting the subfolders is now handled there and not via the routine in the source code.&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Individual bins crashed and then the light curve cannot be assembled''' &amp;lt;br/&amp;gt;&lt;br /&gt;
This is a problem that I fixed manually by editing the source code in the lightcurve.py file. While I have also [https://github.com/fermiPy/fermipy/issues/405 reported this issue to the developers], it has not been fixed yet (as of June 23, 2021). To still be able to create the light curve, I implemented the following hot fix. In case you are setting up your own conda environment and you come across an error related to ''KeyError: fit_success'', here is my hot fix (additions marked with &amp;amp;rArr;):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# this snippet is nearly at the bottom of the file&lt;br /&gt;
&lt;br /&gt;
        itimes = enumerate(zip(times[:-1], times[1:]))&lt;br /&gt;
          &amp;amp;rArr; flux_const_for_calcTS_var = None&lt;br /&gt;
        for i, time in itimes:&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if not 'fit_success' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; self.logger.error(&lt;br /&gt;
                  &amp;amp;rArr; 'fit_success not found in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                  &amp;amp;rArr; continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; elif not mapo[i]['fit_success']:  # 'if' changed to 'elif'&lt;br /&gt;
                self.logger.error(&lt;br /&gt;
                    'Fit failed in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if 'flux_const' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; flux_const_for_calcTS_var = mapo[i]['flux_const']&lt;br /&gt;
&lt;br /&gt;
            for k in o.keys():&lt;br /&gt;
&lt;br /&gt;
                if k == 'config':&lt;br /&gt;
                    continue&lt;br /&gt;
                if not k in mapo[i]:&lt;br /&gt;
                    continue&lt;br /&gt;
                # if (isinstance(o[k], np.ndarray) and&lt;br /&gt;
                #    o[k][i].shape != mapo[i][k].shape):&lt;br /&gt;
                #    gta.logger.warning('Incompatible shape for column %s', k)&lt;br /&gt;
                #    continue&lt;br /&gt;
&lt;br /&gt;
                try:&lt;br /&gt;
                    o[k][i] = mapo[i][k]&lt;br /&gt;
                except:&lt;br /&gt;
                    pass&lt;br /&gt;
&lt;br /&gt;
        systematic = kwargs.get('systematic', 0.02)&lt;br /&gt;
&lt;br /&gt;
        o['ts_var'] = calcTS_var(loglike=o['loglike_fixed'],&lt;br /&gt;
                                 loglike_const=o['loglike_const'],&lt;br /&gt;
                                 flux_err=o['flux_err_fixed'],&lt;br /&gt;
                                   &amp;amp;rArr; flux_const=flux_const_for_calcTS_var,&lt;br /&gt;
                                 systematic=systematic,&lt;br /&gt;
                                 fit_success=o['fit_success_fixed'])&lt;br /&gt;
&lt;br /&gt;
        return o&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
While /karl has the benefit of storing large amounts of data, there is an issue with computing many Fermi spectra or light curves at the same time and storing the results on /karl. The resulting problems are not clearly visible from the log files. A related error can, for example, look like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GTBinnedAnalysis.run_gtapp(): Caught N3tip12TipExceptionE at the top level: setNumRecords could not insert rows in FITS table in extension &amp;quot;EVENTS&amp;quot; in file &amp;quot;/path_to_analysis/LC_analysis/ft1_00.fits&amp;quot; (CFITSIO ERROR 108: error reading from FITS file)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In case you need to compute a lot of spectra or light curves and want to do this on /karl, either limit the number of jobs running simultaneously to 10, or store the data on /userdata, where I have so far never encountered this problem.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
If you encounter a new error, you can also go ahead and start the python script for the analysis directly in the shell. Remember to load your conda environment first. You also need a ''config_SED.yaml'' or ''config_LC.yaml'' file in the folder where you are going to run the analysis, otherwise you won't get far. You can directly run the script, e.g. via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(fermipy)[user@computer]&amp;gt; $FERMITOOLS/SEDcreator_fermipy.py&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve fails with too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
Computing a long light curve with short binning (e.g. daily binning for a year-long light curve, or weekly binning for a light curve covering a few years) can terminate the analysis process when the computer cannot handle the number of open files anymore. I'm not entirely sure what goes wrong internally; my suggested hot fix is to split the computation of the light curve into manageable time intervals (e.g. chunks of ~100 bins). After all chunks have been computed, you can put them together using the command ''ftmerge'':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
&lt;br /&gt;
# Step 1: Create list of all light curve chunk files&lt;br /&gt;
ls -1 /path_to_my_analysis_results/source_name/LC_analysis/*_lightcurve.fits &amp;gt; /path_to_my_analysis_results/source_name.list&lt;br /&gt;
&lt;br /&gt;
# /path_to_my_analysis_results/ should contain the folders of each chunk, e.g. named MJD_start_end&lt;br /&gt;
&lt;br /&gt;
# Step 2: Merge all light curve files into one fits file&lt;br /&gt;
ftmerge @/path_to_my_analysis_results/source_name.list /path_to_my_analysis_results/lightcurve_source_name.fits clobber = yes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
No worries about losing already computed bins in case your analysis crashes: those are not deleted when the script crashes mid-way, so the already computed LC bins are still available and it is not necessary to redo the analysis for all of the light curve bins.&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: September 27, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2240</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2240"/>
		<updated>2021-09-27T16:51:35Z</updated>

		<summary type="html">&lt;p&gt;Gokus: /* Q&amp;amp;A */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day; hence, gamma-ray data are available for every position on the sky for each day. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the scripts, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is extensive documentation available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, please find the information on the Fermi contact person at the Remeis observatory at the bottom of the page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a Python package called ''fermipy'', which allows the data analysis to be done from within a Python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options: activating an existing conda environment, or creating the corresponding conda environment yourself. The first is currently the recommended choice, not least because the pipeline scripts are written for Python 2.7; for the latter, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with this conda environment. To invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition in your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now switch into the conda environment in which the necessary tools for the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub, and the full code, documentation, and installation instructions can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing the right version of each (between versions there was a switch from Python 2.7 to Python 3). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT, and about the status of the spacecraft itself at each time. This information is the main input needed for an analysis. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, so it is not necessary to download a specific dataset. If you are using the provided analysis scripts, they automatically look through all the data available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the Galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
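To verify that the two variables are wired up correctly after editing your shell config, a quick check can be run. The real paths are site-specific, so this sketch stands in a temporary directory for /satdata:

```shell
# Check that LATEXTDIR resolves to a directory below FERMI_DIR
# (a temporary directory stands in for the site-specific /satdata path).
export FERMI_DIR=$(mktemp -d)
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources
mkdir -p "$LATEXTDIR"
[ -d "$LATEXTDIR" ] && echo "LATEXTDIR ok"
```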
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis on a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and both a start time and an end time, for which an average gamma-ray spectrum should be computed. There are several options you can choose from regarding the specifics of the analysis, which you can call via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you do the analysis on a known gamma-ray source, which is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to add the coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
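Since the script expects the name without blanks, a 4FGL designation usually has to be stripped of its blank first. A minimal sketch (the source name and the MJD values in the commented-out call are made-up examples):

```shell
# 4FGL catalog names contain a blank ("4FGL J..."); the script expects
# the name without it, so strip the blank before building the call.
src_catalog_name="4FGL J1104.4+3812"      # hypothetical example source
src=$(printf '%s' "$src_catalog_name" | tr -d ' ')
echo "$src"                               # 4FGLJ1104.4+3812
# The actual call would then be, e.g. (times in MJD):
#   $FERMITOOLS/make_spectrum.sl "$src" 58000 58100
```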
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The computation of a light curve is similarly called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and both a start time and an end time, for which the light curve should be computed. Again, several options are provided for specific analysis needs, which you can call via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
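To get a feeling for how long an analysis might run, it helps to estimate the number of bins beforehand; a small sketch with made-up MJD values:

```shell
# Number of light curve bins for a given time range and binning
# (tmin/tmax are hypothetical MJD values; timebin matches --timebin).
tmin=58000
tmax=58700
timebin=7                     # days per bin (the default, weekly binning)
nbins=$(( (tmax - tmin) / timebin ))
echo "$nbins"                 # 100 bins for this 700-day range
```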
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take several hours up to days (if you want to compute many bins for your light curve). Ideally, you want to let the cluster perform your jobs for you and not run an analysis on your desktop machine, unless you want to test something. To simplify job submission on slurm, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' should look like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the name and the time ranges for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''' gives a full overview of all available options:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known = yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, set only number in unit of Gigabyte; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailadress	If you want to get informed about the slurm status per email, put email here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. In particular, the memory and walltime options need to be increased in case you want to compute a light curve with more than 50 bins.&lt;br /&gt;
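As a rough sanity check before submitting, one can estimate the required walltime from the number of bins. The per-bin runtime below is an assumed figure, not a measured one; adjust it to the runtimes you actually observe:

```shell
# Back-of-the-envelope walltime estimate (min_per_bin is an assumption;
# tune it to the runtimes you see on the cluster).
nbins=150                        # planned number of light curve bins
min_per_bin=10                   # assumed minutes per bin
total_min=$(( nbins * min_per_bin ))
walltime_h=$(( (total_min + 59) / 60 ))   # round up to full hours
echo "$walltime_h"               # 25 hours, i.e. above the 10 h default
```

With 150 bins this estimate exceeds the default of 10 hours, so you would pass a larger value via --walltime.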
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this is not possible to use via slurm at the moment. If you're running an analysis on your machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information on this, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''How can I submit an array job?''' - In order to do that, you need to work from within a bash shell, not a (t)csh shell. I recommend writing a script to create the .slurm file that you want to submit to slurm. Here's an example of what such an array job can look like:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --job-name LATspectra&lt;br /&gt;
#SBATCH --output job.out-%a&lt;br /&gt;
#SBATCH --error err.out-%a&lt;br /&gt;
#SBATCH --time 48:00:00&lt;br /&gt;
#SBATCH --partition=remeis&lt;br /&gt;
#SBATCH --ntasks=1&lt;br /&gt;
#SBATCH --hint=nomultithread&lt;br /&gt;
#SBATCH --mem-per-cpu=4G&lt;br /&gt;
#SBATCH --mail-type=END,FAIL,ARRAY_TASKS&lt;br /&gt;
#SBATCH --mail-user=my@email.de&lt;br /&gt;
#SBATCH --array 0-XXX%YY (XXX: number of tasks in array, YY: number of jobs running simultaneously)&lt;br /&gt;
&lt;br /&gt;
COMMAND[0]=&amp;quot;/software/Science/satscripts/fermiscripts/make_spectrum.sl --expath=/your_path/ 4FGLJXXX.X-XXXX mjd_start mjd_stop&amp;quot;&lt;br /&gt;
COMMAND[1]=&amp;quot;/software/Science/satscripts/fermiscripts/make_spectrum.sl --expath=/your_path/ 4FGLJYYY.Y-YYYY mjd_start mjd_stop&amp;quot;&lt;br /&gt;
COMMAND[2]=&amp;quot;...&amp;quot;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
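In such an array job, slurm runs each COMMAND entry in a separate task, selected via the task index. A self-contained sketch that simulates the selection with harmless echo commands and a hand-set index instead of a real slurm allocation:

```shell
# Each array task receives its index in SLURM_ARRAY_TASK_ID and runs the
# matching COMMAND entry; the index is set by hand here for illustration.
COMMAND[0]="echo spectrum for source A"
COMMAND[1]="echo spectrum for source B"
SLURM_ARRAY_TASK_ID=1            # set automatically by slurm in a real job
eval "${COMMAND[$SLURM_ARRAY_TASK_ID]}"   # prints: spectrum for source B
```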
&lt;br /&gt;
'''Can the scripts find possibly unknown gamma-ray sources by themselves?''' - Yes, when running the analysis script to create a spectrum (with make_spectrum.sl), there is a routine implemented (the fermipy function ''find_sources'') that searches for excesses in the TS map with a significance of 5 sigma or higher. The coordinates of any source found this way are then saved in the folder in which the analysis is run, to allow further inspection and a follow-up analysis.&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
During the Fermi data analysis, several errors can pop up and are not necessarily linked to the Fermi analysis itself. Here, I'll show those I encountered and what I have done to (hot)fix them.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
The internal sub-routine for creating the LCs threw an error a couple of times, stating that one of the LC bin folders was missing and, hence, the light curve could not be assembled into one fits file. Because I couldn't pinpoint the error in the source code, my work-around for this issue is implemented in the wrapper scripts ''make_spectrum.sl'' and ''make_lightcurve.sl''. Deleting the subfolders is now handled there and not via the routine in the source code. &lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Individual bins crashed and then the light curve cannot be assembled''' &amp;lt;br/&amp;gt;&lt;br /&gt;
This is a problem that I fixed manually by editing the source code in the lightcurve.py file. While I have also [https://github.com/fermiPy/fermipy/issues/405|reported this issue to the developers], it has not been fixed yet (June 23, 2021). To still be able to create the light curve, I implemented the following hot fix. In case you are setting up your own conda environment and you come across an error related to ''KeyError: fit_success'', here is my hot fix (additions marked with &amp;amp;rArr;):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# this snippet is nearly at the bottom of the file&lt;br /&gt;
&lt;br /&gt;
        itimes = enumerate(zip(times[:-1], times[1:]))&lt;br /&gt;
          &amp;amp;rArr; flux_const_for_calcTS_var = None&lt;br /&gt;
        for i, time in itimes:&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if not 'fit_success' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; self.logger.error(&lt;br /&gt;
                  &amp;amp;rArr; 'fit_success not found in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                  &amp;amp;rArr; continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; [el]if not mapo[i]['fit_success']:&lt;br /&gt;
                self.logger.error(&lt;br /&gt;
                    'Fit failed in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if 'flux_const' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; flux_const_for_calcTS_var = mapo[i]['flux_const']&lt;br /&gt;
&lt;br /&gt;
            for k in o.keys():&lt;br /&gt;
&lt;br /&gt;
                if k == 'config':&lt;br /&gt;
                    continue&lt;br /&gt;
                if not k in mapo[i]:&lt;br /&gt;
                    continue&lt;br /&gt;
                # if (isinstance(o[k], np.ndarray) and&lt;br /&gt;
                #    o[k][i].shape != mapo[i][k].shape):&lt;br /&gt;
                #    gta.logger.warning('Incompatible shape for column %s', k)&lt;br /&gt;
                #    continue&lt;br /&gt;
&lt;br /&gt;
                try:&lt;br /&gt;
                    o[k][i] = mapo[i][k]&lt;br /&gt;
                except:&lt;br /&gt;
                    pass&lt;br /&gt;
&lt;br /&gt;
        systematic = kwargs.get('systematic', 0.02)&lt;br /&gt;
&lt;br /&gt;
        o['ts_var'] = calcTS_var(loglike=o['loglike_fixed'],&lt;br /&gt;
                                 loglike_const=o['loglike_const'],&lt;br /&gt;
                                 flux_err=o['flux_err_fixed'],&lt;br /&gt;
                                   &amp;amp;rArr; flux_const=flux_const_for_calcTS_var,&lt;br /&gt;
                                 systematic=systematic,&lt;br /&gt;
                                 fit_success=o['fit_success_fixed'])&lt;br /&gt;
&lt;br /&gt;
        return o&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
While /karl has the benefit of storing large amounts of data, there is an issue with computing many Fermi spectra or light curves at the same time and storing them on /karl. The problems that arise from this are not clearly visible in the log files. An error related to this can, for example, look like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GTBinnedAnalysis.run_gtapp(): Caught N3tip12TipExceptionE at the top level: setNumRecords could not insert rows in FITS table in extension &amp;quot;EVENTS&amp;quot; in file &amp;quot;/path_to_analysis/LC_analysis/ft1_00.fits&amp;quot; (CFITSIO ERROR 108: error reading from FITS file)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In case you need to compute a lot of spectra or light curves and want to do this on /karl, either limit the number of jobs running simultaneously to 10, or store the data on /userdata, where I have so far never encountered this problem.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
If you encounter a new error, you can also go ahead and start the python script for the analysis directly in the shell. Remember to load your conda environment first. You also need a ''config_SED.yaml'' or ''config_LC.yaml'' file in the folder where you are going to run the analysis, otherwise you won't get far. You can directly run the script, e.g. via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(fermipy)[user@computer]&amp;gt; $FERMITOOLS/SEDcreator_fermipy.py&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve fails too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
Computing a long light curve with a short binning (e.g. daily binning for a year-long light curve, or weekly binning for a light curve covering a few years) can result in a termination of the analysis process when the computer cannot handle the number of open files anymore. I'm not entirely sure what goes wrong internally; however, my suggested hot fix is to split the computation of the light curve into manageable time intervals (e.g. chunks of ~100 bins). After all chunks have been computed, you can put them together using the command ''ftmerge'':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
&lt;br /&gt;
# Step 1: Create list of all light curve chunk files&lt;br /&gt;
ls -1 /path_to_my_analysis_results/source_name/LC_analysis/*_lightcurve.fits &amp;gt; /path_to_my_analysis_results/source_name.list&lt;br /&gt;
&lt;br /&gt;
# /path_to_my_analysis_results/ should contain the folders of each chunk, e.g. named MJD_start_end&lt;br /&gt;
&lt;br /&gt;
# Step 2: Merge all light curve files into one fits file&lt;br /&gt;
ftmerge @/path_to_my_analysis_results/source_name.list /path_to_my_analysis_results/lightcurve_source_name.fits clobber = yes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
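The chunk boundaries themselves can be generated with a few lines of shell arithmetic; a sketch with hypothetical MJD values, assuming chunks of 100 bins:

```shell
# Split a long light curve into chunks of ~100 bins and print the MJD
# range of each chunk (the values here are hypothetical).
tmin=55000; tmax=56400; timebin=7   # 200 weekly bins in total
bins_per_chunk=100
chunk_days=$(( bins_per_chunk * timebin ))   # 700 days per chunk
start=$tmin
while [ "$start" -lt "$tmax" ]; do
  end=$(( start + chunk_days ))
  if [ "$end" -gt "$tmax" ]; then end=$tmax; fi
  echo "chunk: $start $end"       # run one analysis per printed range
  start=$end
done
```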
No need to worry about losing already computed bins if your analysis crashes: completed LC bins are not deleted when the script crashes mid-way, so they are still available afterwards and it is not necessary to redo the analysis for all light curve bins.&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: June 23, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2228</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2228"/>
		<updated>2021-06-29T14:41:16Z</updated>

		<summary type="html">&lt;p&gt;Gokus: /* Troubleshooting */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day; hence, gamma-ray data are available for every position on the sky for each day. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the scripts, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is extensive documentation available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, please find the information on the Fermi contact person at the Remeis observatory at the bottom of the page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a Python package called ''fermipy'', which allows the data analysis to be done from within a Python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options: activating an existing conda environment, or creating the corresponding conda environment yourself. The first is currently the recommended choice, not least because the pipeline scripts are written for Python 2.7; for the latter, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with this conda environment. To invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition in your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now switch into the conda environment in which the necessary tools for the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub, and the full code, documentation, and installation instructions can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing matching versions of each (in between versions there was a change from python2.7 to python3). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
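As a rough sketch only (the channel names and version pins below are examples; please verify them against the linked installation instructions before use), creating such an environment could look like:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
# create a conda environment containing the Fermitools (example version pin)&lt;br /&gt;
conda create --name fermipy -c conda-forge -c fermi fermitools=1.2.23&lt;br /&gt;
conda activate fermipy&lt;br /&gt;
# install a matching fermipy version into the same environment (example pin)&lt;br /&gt;
pip install fermipy==0.20.0&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;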
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT, and about the status of the spacecraft itself at each time. This information is the main input for any analysis. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, hence it is not necessary to download a specific dataset. If you are using the provided analysis scripts, they automatically look through all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the Galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis on a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and both a start time and an end time, for which an average gamma-ray spectrum should be computed. There are several options you can choose from regarding the specifics of the analysis, which you can call via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you do the analysis on a known gamma-ray source, which is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to add the coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
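As an illustration, a call for a known 4FGL source and one for an unlisted source could look like this (source names, coordinates, and MJD values below are placeholders):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# average spectrum of a 4FGL source between MJD 58000 and 58100 (example values)&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl 4FGLJ0222.6+4302 58000 58100&lt;br /&gt;
&lt;br /&gt;
# source not in the 4FGL: mark it as unknown and pass coordinates (example values)&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --known=no --ra=35.67 --dec=43.04 MySource 58000 58100&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;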
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The computation of a light curve is similarly called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and both a start time and an end time, for which the light curve should be computed. Again, several options are provided for the specific needs of the analysis, which you can list via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
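For example, a monthly binned light curve could be requested like this (source name, MJD range, and bin length below are placeholders):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# light curve with 30-day bins between MJD 58000 and 59000 (example values)&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --timebin=30 4FGLJ0222.6+4302 58000 59000&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;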
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take several hours up to days (if you want to compute many bins for your light curve). Ideally you want to let the cluster perform your jobs for you and not run an analysis on your desktop machine, unless you want to test something. To simplify the job submission on slurm, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' looks like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the name and the time ranges for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''', gives a full overview of all options available here:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known = yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, set only number in unit of Gigabyte; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailadress	If you want to get informed about the slurm status per email, put email here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. In particular, the memory and walltime options need to be increased in case you want to compute a light curve with more than 50 bins.&lt;br /&gt;
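As an illustration, a long light curve job with increased walltime and memory could be submitted like this (source name, MJD range, and resource values below are placeholders):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# LC analysis with 48 h walltime and 6 GB per CPU (example values)&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl --walltime=48 --mem=6 LC 4FGLJ0222.6+4302 56000 59000&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;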
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this is not possible to use via slurm at the moment. If you're running an analysis on your machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information on this, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''How can I submit an array job?''' - In order to do that, you need to work from within a bash shell, not a (t)csh shell. I recommend writing a script that creates the .slurm file you want to submit to slurm. Here's an example of what such an array job can look like:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --job-name LATspectra&lt;br /&gt;
#SBATCH --output job.out-%a&lt;br /&gt;
#SBATCH --error err.out-%a&lt;br /&gt;
#SBATCH --time 48:00:00&lt;br /&gt;
#SBATCH --partition=remeis&lt;br /&gt;
#SBATCH --ntasks=1&lt;br /&gt;
#SBATCH --hint=nomultithread&lt;br /&gt;
#SBATCH --mem-per-cpu=4G&lt;br /&gt;
#SBATCH --mail-type=END,FAIL,ARRAY_TASKS&lt;br /&gt;
#SBATCH --mail-user=my@email.de&lt;br /&gt;
#SBATCH --array 0-XXX%YY (XXX: number of tasks in array, YY: number of jobs running simultaneously)&lt;br /&gt;
&lt;br /&gt;
COMMAND[0]=&amp;quot;/software/Science/satscripts/fermiscripts/make_spectrum.sl --expath=/your_path/ 4FGLJXXX.X-XXXX mjd_start mjd_stop&amp;quot;&lt;br /&gt;
COMMAND[1]=&amp;quot;/software/Science/satscripts/fermiscripts/make_spectrum.sl --expath=/your_path/ 4FGLJYYY.Y-YYYY mjd_start mjd_stop&amp;quot;&lt;br /&gt;
COMMAND[2]=&amp;quot;...&amp;quot;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
During the Fermi data analysis, several errors can pop up that are not necessarily linked to the Fermi analysis itself. Here, I'll show those I encountered and what I have done to (hot)fix them.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
The internal sub-routine for creating the LCs threw an error a couple of times, complaining that one of the LC bin folders was missing, so the light curve could not be assembled into one fits file. Because I couldn't pinpoint the error in the source code, my work-around for this issue is implemented in the wrapper scripts ''make_spectrum.sl'' and ''make_lightcurve.sl'': deleting the subfolders is now handled there and not via the routine in the source code. &lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Individual bins crashed and then the light curve cannot be assembled''' &amp;lt;br/&amp;gt;&lt;br /&gt;
This is a problem that I fixed manually by editing the source code in the lightcurve.py file. While I have also [https://github.com/fermiPy/fermipy/issues/405 reported this issue to the developers], it has not been fixed yet (as of June 23, 2021). To still be able to create the light curve, I implemented the following hot fix. In case you are setting up your own conda environment and you come across an error related to ''KeyError: fit_success'', here is my hot fix (additions marked with &amp;amp;rArr;):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# this snippet is nearly at the bottom of the file&lt;br /&gt;
&lt;br /&gt;
        itimes = enumerate(zip(times[:-1], times[1:]))&lt;br /&gt;
          &amp;amp;rArr; flux_const_for_calcTS_var = None&lt;br /&gt;
        for i, time in itimes:&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if not 'fit_success' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; self.logger.error(&lt;br /&gt;
                  &amp;amp;rArr; 'fit_success not found in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                  &amp;amp;rArr; continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; elif not mapo[i]['fit_success']:  # 'if' changed to 'elif'&lt;br /&gt;
                self.logger.error(&lt;br /&gt;
                    'Fit failed in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if 'flux_const' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; flux_const_for_calcTS_var = mapo[i]['flux_const']&lt;br /&gt;
&lt;br /&gt;
            for k in o.keys():&lt;br /&gt;
&lt;br /&gt;
                if k == 'config':&lt;br /&gt;
                    continue&lt;br /&gt;
                if not k in mapo[i]:&lt;br /&gt;
                    continue&lt;br /&gt;
                # if (isinstance(o[k], np.ndarray) and&lt;br /&gt;
                #    o[k][i].shape != mapo[i][k].shape):&lt;br /&gt;
                #    gta.logger.warning('Incompatible shape for column %s', k)&lt;br /&gt;
                #    continue&lt;br /&gt;
&lt;br /&gt;
                try:&lt;br /&gt;
                    o[k][i] = mapo[i][k]&lt;br /&gt;
                except:&lt;br /&gt;
                    pass&lt;br /&gt;
&lt;br /&gt;
        systematic = kwargs.get('systematic', 0.02)&lt;br /&gt;
&lt;br /&gt;
        o['ts_var'] = calcTS_var(loglike=o['loglike_fixed'],&lt;br /&gt;
                                 loglike_const=o['loglike_const'],&lt;br /&gt;
                                 flux_err=o['flux_err_fixed'],&lt;br /&gt;
                                   &amp;amp;rArr; flux_const=flux_const_for_calcTS_var,&lt;br /&gt;
                                 systematic=systematic,&lt;br /&gt;
                                 fit_success=o['fit_success_fixed'])&lt;br /&gt;
&lt;br /&gt;
        return o&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
While /karl has the benefit of storing large amounts of data, there is an issue with computing many Fermi spectra or light curves at the same time and storing them on /karl. The problems that arise because of this are not clearly visible from checking the log files. An error that can be related to this looks, for example, like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GTBinnedAnalysis.run_gtapp(): Caught N3tip12TipExceptionE at the top level: setNumRecords could not insert rows in FITS table in extension &amp;quot;EVENTS&amp;quot; in file &amp;quot;/path_to_analysis/LC_analysis/ft1_00.fits&amp;quot; (CFITSIO ERROR 108: error reading from FITS file)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In case you need to compute a lot of spectra or light curves and want to do this on /karl, either limit the number of jobs running simultaneously to 10, or store the data on /userdata, where I have so far never encountered this problem.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
If you encounter a new error, you can also go ahead and start the python script for the analysis directly in the shell. Remember to load your conda environment first. You also need a ''config_SED.yaml'' or ''config_LC.yaml'' file in the folder where you are going to run the analysis, otherwise you won't get far. You can directly run the script, e.g. via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(fermipy)[user@computer]&amp;gt; $FERMITOOLS/SEDcreator_fermipy.py&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve fails too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
Computing a long light curve with short binning (e.g. daily binning for a year-long light curve, or weekly binning for a light curve covering a few years) can result in a termination of the analysis process when the computer cannot handle the number of open files anymore. I'm not entirely sure what goes wrong internally; however, my suggested hot fix is to split the computation of the light curve into manageable time intervals (e.g. chunks of ~100 bins). After all chunks have been computed, you can put them together using the command ''ftmerge'':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
&lt;br /&gt;
# Step 1: Create list of all light curve chunk files&lt;br /&gt;
ls -1 /path_to_my_analysis_results/source_name/LC_analysis/*_lightcurve.fits &amp;gt; /path_to_my_analysis_results/source_name.list&lt;br /&gt;
&lt;br /&gt;
# /path_to_my_analysis_results/ should contain the folders of each chunk, e.g. named MJD_start_end&lt;br /&gt;
&lt;br /&gt;
# Step 2: Merge all light curve files into one fits file&lt;br /&gt;
ftmerge @/path_to_my_analysis_results/source_name.list /path_to_my_analysis_results/lightcurve_source_name.fits clobber=yes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
No worries about losing already computed bins in case your analysis crashes: those are not deleted when the script crashes mid-way, so the already computed LC bins are still available after the crash and it is not necessary to re-do the analysis for all of the light curve bins.&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: June 23, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2225</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2225"/>
		<updated>2021-06-23T14:54:17Z</updated>

		<summary type="html">&lt;p&gt;Gokus: /* Contact person */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the script, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is an extensive documentation available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, please find the information on the Fermi contact person at the Remeis observatory at the bottom of the page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package, called ''fermipy'', such that one can do the data analysis from within a python environment. There are analysis scripts, written with ''fermipy'', available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, both the Fermitools and ''fermipy'' must be installed, ideally in a conda environment. There are two options: using the existing shared conda environment, or creating a corresponding conda environment yourself. The first option is currently the recommended choice, because the pipeline scripts are written for python 2.7; for the second option you need to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the versions installed in this conda environment. To invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition in your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub and the full code, documentation and installation can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing matching versions of each (in between versions there was a change from python2.7 to python3). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT, and about the status of the spacecraft itself at each time. This information is the main input for any analysis. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, hence it is not necessary to download a specific dataset. If you are using the provided analysis scripts, they automatically look through all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the Galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis on a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be invoked via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and both a start time and an end time, for which an average gamma-ray spectrum should be computed. There are several options you can choose from regarding the specifics of the analysis, which you can list via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you do the analysis on a known gamma-ray source, which is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to add the coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
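As a concrete illustration, the two typical call patterns look as follows (a sketch only: the 4FGL name, the coordinates, and the MJD range are placeholders, not real analysis targets):&lt;br /&gt;

```shell
# Illustrative only: source name, coordinates, and MJD ranges are placeholders.
# The commands are assembled as strings so they can be inspected before being
# run on a machine where $FERMITOOLS is defined.

# Known 4FGL source, average spectrum over MJD 59000-59100:
CMD_KNOWN="$FERMITOOLS/make_spectrum.sl 4FGLJ1104.4+3812 59000 59100"

# Source not in the 4FGL: switch off the catalog lookup and pass coordinates:
CMD_NEW="$FERMITOOLS/make_spectrum.sl --known=no --ra=166.11 --dec=38.21 MySource 59000 59100"

echo "$CMD_KNOWN"
echo "$CMD_NEW"
```

Following the help text above, --ra/--dec only make sense together with --known=no; for a 4FGL source, the name alone is enough.&lt;br /&gt;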
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The script for computing a light curve is similarly called ''make_lightcurve.sl'' and can be invoked via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and both a start time and an end time, for which the light curve should be computed. Again, several options are provided to tailor the analysis, which you can list via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which the light curve will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
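For example, to get a daily-binned light curve instead of the weekly default (a sketch; the source name and MJD range are placeholders):&lt;br /&gt;

```shell
# Placeholder source and time range; the command is assembled as a string
# so it can be inspected before running it where $FERMITOOLS is defined.
CMD="$FERMITOOLS/make_lightcurve.sl --timebin=1 --savebins=False 4FGLJ1104.4+3812 59000 59365"
echo "$CMD"
```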
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take from several hours up to days (if you want to compute many bins for your light curve). Ideally, you let the cluster perform your jobs for you instead of running an analysis on your desktop machine, unless you want to test something. To simplify job submission to slurm, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. It is used as follows:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the name and the time ranges for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''', gives a full overview of all options available here:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU; give only the number, in units of gigabytes; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailaddress	If you want to be informed about the slurm status per email, put your email address here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. In particular, the memory and walltime options need to be increased in case you want to compute a light curve with more than 50 bins.&lt;br /&gt;
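For instance, a submission for a multi-year weekly light curve with increased walltime and memory might look like this (a sketch; the source name, time range, and email address are placeholders):&lt;br /&gt;

```shell
# Placeholder values throughout; the command is assembled as a string
# so it can be inspected before submitting on the cluster.
# --walltime in hours, --mem as a plain number in gigabytes (see help above).
CMD="$FERMITOOLS/submit_fermijob.sl --walltime=48 --mem=6 --email=my@email.de LC 4FGLJ1104.4+3812 58000 59500"
echo "$CMD"
```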
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this cannot currently be used via slurm. If you're running an analysis on your machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information on this, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''How can I submit an array job?''' - In order to do that, you need to work from within a bash shell, not a (t)csh shell. I recommend writing a script that creates the .slurm file you want to submit to slurm. Here is an example of what such an array job can look like:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --job-name LATspectra&lt;br /&gt;
#SBATCH --output job.out-%a&lt;br /&gt;
#SBATCH --error err.out-%a&lt;br /&gt;
#SBATCH --time 48:00:00&lt;br /&gt;
#SBATCH --partition=remeis&lt;br /&gt;
#SBATCH --ntasks=1&lt;br /&gt;
#SBATCH --hint=nomultithread&lt;br /&gt;
#SBATCH --mem-per-cpu=4G&lt;br /&gt;
#SBATCH --mail-type=END,FAIL,ARRAY_TASKS&lt;br /&gt;
#SBATCH --mail-user=my@email.de&lt;br /&gt;
#SBATCH --array 0-XXX%YY  # XXX: number of tasks in the array, YY: number of jobs running simultaneously&lt;br /&gt;
&lt;br /&gt;
COMMAND[0]=&amp;quot;/software/Science/satscripts/fermiscripts/make_spectrum.sl --expath=/your_path/ 4FGLJXXX.X-XXXX mjd_start mjd_stop&amp;quot;&lt;br /&gt;
COMMAND[1]=&amp;quot;/software/Science/satscripts/fermiscripts/make_spectrum.sl --expath=/your_path/ 4FGLJYYY.Y-YYYY mjd_start mjd_stop&amp;quot;&lt;br /&gt;
COMMAND[2]=&amp;quot;...&amp;quot;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
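The snippet above only defines the COMMAND array; a dispatch line at the end of the .slurm file selects and runs the entry matching the slurm array task ID. A minimal sketch of that mechanism (the echo commands are stand-ins for the actual make_spectrum.sl calls):&lt;br /&gt;

```shell
#!/bin/bash
# Sketch of the dispatch mechanism at the end of an array job file:
# slurm sets SLURM_ARRAY_TASK_ID for each task, and the matching
# COMMAND entry is evaluated. The echoes are stand-ins for real calls.
COMMAND[0]="echo spectrum for source 0"
COMMAND[1]="echo spectrum for source 1"
: "${SLURM_ARRAY_TASK_ID:=0}"   # outside slurm, fall back to task 0
RESULT=$(eval "${COMMAND[$SLURM_ARRAY_TASK_ID]}")
echo "$RESULT"
```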
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
During the Fermi data analysis, several errors can pop up that are not necessarily linked to the Fermi analysis itself. Here, I'll show those I encountered and what I have done to (hot)fix them.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
The internal sub-routine for creating the LCs threw an error a couple of times, stating that one of the LC bin folders was missing and, hence, the light curve could not be assembled into one fits file. Because I couldn't pinpoint the error in the source code, my work-around for this issue is implemented in the wrapper scripts ''make_spectrum.sl'' and ''make_lightcurve.sl''. Deleting the subfolders is now handled in there and not via the routine in the source code. &lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Individual bins crashed and then the light curve cannot be assembled''' &amp;lt;br/&amp;gt;&lt;br /&gt;
This is a problem that I fixed manually by editing the source code in the lightcurve.py file. While I have also [https://github.com/fermiPy/fermipy/issues/405 reported this issue to the developers], it has not been fixed yet as of June 23, 2021. In order to still manage to create the light curve, I implemented the following hot fix. In case you are setting up your own conda environment and you come across an error related to ''KeyError: fit_success'', here is my hot fix (additions marked with &amp;amp;rArr;):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# this snippet is nearly at the bottom of the file&lt;br /&gt;
&lt;br /&gt;
        itimes = enumerate(zip(times[:-1], times[1:]))&lt;br /&gt;
          &amp;amp;rArr; flux_const_for_calcTS_var = None&lt;br /&gt;
        for i, time in itimes:&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if not 'fit_success' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; self.logger.error(&lt;br /&gt;
                  &amp;amp;rArr; 'fit_success not found in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                  &amp;amp;rArr; continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; elif not mapo[i]['fit_success']:  # only the 'el' was added to the existing 'if'&lt;br /&gt;
                self.logger.error(&lt;br /&gt;
                    'Fit failed in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if 'flux_const' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; flux_const_for_calcTS_var = mapo[i]['flux_const']&lt;br /&gt;
&lt;br /&gt;
            for k in o.keys():&lt;br /&gt;
&lt;br /&gt;
                if k == 'config':&lt;br /&gt;
                    continue&lt;br /&gt;
                if not k in mapo[i]:&lt;br /&gt;
                    continue&lt;br /&gt;
                # if (isinstance(o[k], np.ndarray) and&lt;br /&gt;
                #    o[k][i].shape != mapo[i][k].shape):&lt;br /&gt;
                #    gta.logger.warning('Incompatible shape for column %s', k)&lt;br /&gt;
                #    continue&lt;br /&gt;
&lt;br /&gt;
                try:&lt;br /&gt;
                    o[k][i] = mapo[i][k]&lt;br /&gt;
                except:&lt;br /&gt;
                    pass&lt;br /&gt;
&lt;br /&gt;
        systematic = kwargs.get('systematic', 0.02)&lt;br /&gt;
&lt;br /&gt;
        o['ts_var'] = calcTS_var(loglike=o['loglike_fixed'],&lt;br /&gt;
                                 loglike_const=o['loglike_const'],&lt;br /&gt;
                                 flux_err=o['flux_err_fixed'],&lt;br /&gt;
                                   &amp;amp;rArr; flux_const=flux_const_for_calcTS_var,&lt;br /&gt;
                                 systematic=systematic,&lt;br /&gt;
                                 fit_success=o['fit_success_fixed'])&lt;br /&gt;
&lt;br /&gt;
        return o&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
While /karl has the benefit of storing large amounts of data, there is an issue with computing many Fermi spectra or light curves at the same time and storing them on /karl. The problems that arise because of this are not clearly visible from checking the log files. An error related to this can, for example, look like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GTBinnedAnalysis.run_gtapp(): Caught N3tip12TipExceptionE at the top level: setNumRecords could not insert rows in FITS table in extension &amp;quot;EVENTS&amp;quot; in file &amp;quot;/path_to_analysis/LC_analysis/ft1_00.fits&amp;quot; (CFITSIO ERROR 108: error reading from FITS file)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In case you need to compute a lot of spectra or light curves and want to do this on /karl, either limit the number of jobs running simultaneously to 10, or store the data on /userdata, where I have so far never encountered this problem.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
If you encounter a new error, you can also go ahead and start the python script for the analysis directly in the shell. Remember to load your conda environment first. You also need a ''config_SED.yaml'' or ''config_LC.yaml'' file in the folder where you are going to run the analysis, otherwise you won't get far. You can directly run the script, e.g. via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(fermipy)[user@computer]&amp;gt; $FERMITOOLS/SEDcreator_fermipy.py&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve fails with too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
Computing a long light curve with short time bins (e.g. daily binning for a year-long light curve, or weekly binning for a light curve covering a few years) can result in a termination of the analysis process when the computer cannot handle the number of open files anymore. I'm not entirely sure what goes wrong internally; however, my suggested hot fix is to split the computation of the light curve into manageable time intervals (e.g. chunks of ~100 bins). After all chunks have been computed, you can put them together using the command ''ftmerge'':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
&lt;br /&gt;
# Step 1: Create list of all light curve chunk files&lt;br /&gt;
ls -1 /path_to_my_analysis_results/source_name/LC_analysis/*_lightcurve.fits &amp;gt; /path_to_my_analysis_results/source_name.list&lt;br /&gt;
&lt;br /&gt;
# /path_to_my_analysis_results/ should contain the folders of each chunk, e.g. named MJD_start_end&lt;br /&gt;
&lt;br /&gt;
# Step 2: Merge all light curve files into one fits file&lt;br /&gt;
ftmerge @/path_to_my_analysis_results/source_name.list /path_to_my_analysis_results/lightcurve_source_name.fits clobber=yes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
No need to worry about losing already computed bins in case your analysis crashes: those are not deleted when the script crashes mid-way, so the already computed LC bins are still available afterwards and it is not necessary to re-do the analysis for all of the light curve bins.&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: June 23, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2224</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2224"/>
		<updated>2021-06-23T14:53:59Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the scripts, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, extensive documentation is available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, please find the information on the Fermi contact person at the Remeis observatory at the bottom of the page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package, called ''fermipy'', such that one can do the data analysis from within a python environment. There are analysis scripts, written with ''fermipy'', available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options: using an already existing conda environment, or creating the corresponding conda environment yourself. The first option is currently recommended, not least because the pipeline scripts are written for Python 2.7; for the latter, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the version of this conda environment. In order to invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition in your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub and the full code, documentation and installation can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing the right version of each (as the change from Python 2.7 to Python 3 happened between versions). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT, and information about the status of the spacecraft itself at each point in time. This information is needed as the main input for an analysis. These files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory; hence, it is not necessary to download a specific dataset. If you are using the provided analysis scripts, they automatically look through all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the Galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis, as the background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis on a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be invoked via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and both a start time and an end time, for which an average gamma-ray spectrum should be computed. There are several options you can choose from regarding the specifics of the analysis, which you can list via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you do the analysis on a known gamma-ray source, which is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to add the coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The script for computing a light curve is similarly called ''make_lightcurve.sl'' and can be invoked via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and both a start time and an end time, for which the light curve should be computed. Again, several options are provided to tailor the analysis, which you can list via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
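As a quick sanity check before submitting a job, you can estimate how many light curve bins a given time range will produce. This is a hedged sketch assuming the script simply tiles the MJD range with --timebin-sized bins (its exact handling of a partial last bin may differ); the MJD values are placeholders:&lt;br /&gt;

```python
import math

# Example values (placeholders): a 70-day MJD range with the default
# weekly binning. Whether a partial last bin is kept is an assumption.
tmin, tmax = 59000.0, 59070.0   # MJD
timebin = 7.0                   # days, the --timebin default

nbins = math.ceil((tmax - tmin) / timebin)
print(nbins)  # 10
```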
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take several hours up to days (if you want to compute many bins for your light curve). Ideally you want to let the cluster perform your jobs for you and not run an analysis on your desktop machine, unless you want to test something. In order to make life easy for a job submission on slurm, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' should look like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the name and the time ranges for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''', gives a full overview of all options available here:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known = yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, set only number in unit of Gigabyte; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailadress	If you want to get informed about the slurm status per email, put email here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. In particular, the memory and walltime options need to be increased if you want to compute a light curve with more than 50 bins.&lt;br /&gt;
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this is currently not possible via slurm. If you're running an analysis on your own machine with your own scripts, you can activate multi-threading for the computation of a light curve. For more information, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''How can I submit an array job?''' - In order to do that, you need to work from within a bash shell, not a (t)csh shell. I recommend writing a script that creates the .slurm file you want to submit. Here's an example of what such an array job can look like:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --job-name LATspectra&lt;br /&gt;
#SBATCH --output job.out-%a&lt;br /&gt;
#SBATCH --error err.out-%a&lt;br /&gt;
#SBATCH --time 48:00:00&lt;br /&gt;
#SBATCH --partition=remeis&lt;br /&gt;
#SBATCH --ntasks=1&lt;br /&gt;
#SBATCH --hint=nomultithread&lt;br /&gt;
#SBATCH --mem-per-cpu=4G&lt;br /&gt;
#SBATCH --mail-type=END,FAIL,ARRAY_TASKS&lt;br /&gt;
#SBATCH --mail-user=my@email.de&lt;br /&gt;
#SBATCH --array 0-XXX%YY  (XXX: number of tasks in array, YY: number of jobs running simultaneously)&lt;br /&gt;
&lt;br /&gt;
COMMAND[0]=&amp;quot;/software/Science/satscripts/fermiscripts/make_spectrum.sl --expath=/your_path/ 4FGLJXXX.X-XXXX mjd_start mjd_stop&amp;quot;&lt;br /&gt;
COMMAND[1]=&amp;quot;/software/Science/satscripts/fermiscripts/make_spectrum.sl --expath=/your_path/ 4FGLJYYY.Y-YYYY mjd_start mjd_stop&amp;quot;&lt;br /&gt;
COMMAND[2]=&amp;quot;...&amp;quot;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
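Writing those COMMAND lines by hand gets tedious for many sources. Here is a hedged python sketch that generates such a .slurm file from a source list; the source names, the MJD range, and the output path are placeholders, only the script location mirrors the example above:&lt;br /&gt;

```python
# Placeholder source names and MJD range; adjust the paths for your analysis.
sources = ["4FGLJ0001.0-0001", "4FGLJ0002.0-0002", "4FGLJ0003.0-0003"]
tmin, tmax = 58000, 58100  # MJD

header = [
    "#!/bin/bash",
    "#SBATCH --job-name LATspectra",
    "#SBATCH --output job.out-%a",
    "#SBATCH --error err.out-%a",
    "#SBATCH --time 48:00:00",
    "#SBATCH --partition=remeis",
    "#SBATCH --ntasks=1",
    "#SBATCH --mem-per-cpu=4G",
    # run at most 2 array tasks at the same time
    f"#SBATCH --array 0-{len(sources) - 1}%2",
]
commands = [
    f'COMMAND[{i}]="/software/Science/satscripts/fermiscripts/make_spectrum.sl '
    f'--expath=/your_path/ {src} {tmin} {tmax}"'
    for i, src in enumerate(sources)
]
# Each array task picks its command via SLURM_ARRAY_TASK_ID
runner = 'eval "${COMMAND[$SLURM_ARRAY_TASK_ID]}"'

with open("fermi_array.slurm", "w") as f:
    f.write("\n".join(header + commands + [runner]) + "\n")
```

Submitting the generated file then works via '''sbatch fermi_array.slurm'''.&lt;br /&gt;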
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
During the Fermi data analysis, several errors can pop up that are not necessarily linked to the Fermi analysis itself. Here, I'll show those I encountered and what I have done to (hot)fix them.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
The internal sub-routine for creating the LCs threw an error a couple of times, complaining that one of the LC bin folders was missing, so the light curve could not be assembled into one fits file. Because I couldn't pinpoint the error in the source code, my work-around is implemented in the wrapper scripts ''make_spectrum.sl'' and ''make_lightcurve.sl'': deleting the subfolders is now handled there and not by the routine in the source code.&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Individual bins crashed and then the light curve cannot be assembled''' &amp;lt;br/&amp;gt;&lt;br /&gt;
This is a problem that I fixed manually by editing the source code in the lightcurve.py file. While I have also [https://github.com/fermiPy/fermipy/issues/405 reported this issue to the developers], it has not been fixed yet (as of June 23, 2021). To still be able to create the light curve, I implemented the following hot fix. In case you are setting up your own conda environment and come across an error related to ''KeyError: fit_success'', here is my hot fix (additions marked with &amp;amp;rArr;):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# this snippet is nearly at the bottom of the file&lt;br /&gt;
&lt;br /&gt;
        itimes = enumerate(zip(times[:-1], times[1:]))&lt;br /&gt;
          &amp;amp;rArr; flux_const_for_calcTS_var = None&lt;br /&gt;
        for i, time in itimes:&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if not 'fit_success' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; self.logger.error(&lt;br /&gt;
                  &amp;amp;rArr; 'fit_success not found in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                  &amp;amp;rArr; continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; elif not mapo[i]['fit_success']:  # 'if' changed to 'elif'&lt;br /&gt;
                self.logger.error(&lt;br /&gt;
                    'Fit failed in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if 'flux_const' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; flux_const_for_calcTS_var = mapo[i]['flux_const']&lt;br /&gt;
&lt;br /&gt;
            for k in o.keys():&lt;br /&gt;
&lt;br /&gt;
                if k == 'config':&lt;br /&gt;
                    continue&lt;br /&gt;
                if not k in mapo[i]:&lt;br /&gt;
                    continue&lt;br /&gt;
                # if (isinstance(o[k], np.ndarray) and&lt;br /&gt;
                #    o[k][i].shape != mapo[i][k].shape):&lt;br /&gt;
                #    gta.logger.warning('Incompatible shape for column %s', k)&lt;br /&gt;
                #    continue&lt;br /&gt;
&lt;br /&gt;
                try:&lt;br /&gt;
                    o[k][i] = mapo[i][k]&lt;br /&gt;
                except:&lt;br /&gt;
                    pass&lt;br /&gt;
&lt;br /&gt;
        systematic = kwargs.get('systematic', 0.02)&lt;br /&gt;
&lt;br /&gt;
        o['ts_var'] = calcTS_var(loglike=o['loglike_fixed'],&lt;br /&gt;
                                 loglike_const=o['loglike_const'],&lt;br /&gt;
                                 flux_err=o['flux_err_fixed'],&lt;br /&gt;
                                   &amp;amp;rArr; flux_const=flux_const_for_calcTS_var,&lt;br /&gt;
                                 systematic=systematic,&lt;br /&gt;
                                 fit_success=o['fit_success_fixed'])&lt;br /&gt;
&lt;br /&gt;
        return o&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
While /karl has the benefit of storing large amounts of data, there is an issue with computing many Fermi spectra or light curves at the same time and storing them on /karl. The resulting problems are not clearly visible from the log files. An error related to this can, for example, look like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GTBinnedAnalysis.run_gtapp(): Caught N3tip12TipExceptionE at the top level: setNumRecords could not insert rows in FITS table in extension &amp;quot;EVENTS&amp;quot; in file &amp;quot;/path_to_analysis/LC_analysis/ft1_00.fits&amp;quot; (CFITSIO ERROR 108: error reading from FITS file)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In case you need to compute a lot of spectra or light curves and want to do this on /karl, either limit the number of jobs running simultaneously to 10, or store the data on /userdata, where I have so far never encountered this problem.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
If you encounter a new error, you can also go ahead and start the python script for the analysis directly in the shell. Remember to load your conda environment first. You also need a ''config_SED.yaml'' or ''config_LC.yaml'' file in the folder where you are going to run the analysis, otherwise you won't get far. You can directly run the script, e.g. via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(fermipy)[user@computer]&amp;gt; $FERMITOOLS/SEDcreator_fermipy.py&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve fails due to too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
Computing a long light curve with short time bins (e.g. daily binning for a year-long light curve, or weekly binning for a light curve covering a few years) can terminate the analysis process when the computer cannot handle the number of open files anymore. I'm not entirely sure what goes wrong internally; my suggested hot fix is to split the computation of the light curve into manageable time intervals (e.g. chunks of ~100 bins). After all chunks have been computed, you can merge them using the command ''ftmerge'':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
&lt;br /&gt;
# Step 1: Create list of all light curve chunk files&lt;br /&gt;
ls -1 /path_to_my_analysis_results/source_name/LC_analysis/*_lightcurve.fits &amp;gt; /path_to_my_analysis_results/source_name.list&lt;br /&gt;
&lt;br /&gt;
# /path_to_my_analysis_results/ should contain the folders of each chunk, e.g. named MJD_start_end&lt;br /&gt;
&lt;br /&gt;
# Step 2: Merge all light curve files into one fits file&lt;br /&gt;
ftmerge @/path_to_my_analysis_results/source_name.list /path_to_my_analysis_results/lightcurve_source_name.fits clobber=yes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
No worries about losing already computed bins if your analysis crashes: they are not deleted when the script stops mid-way, so the finished LC bins remain available and you do not need to re-do the analysis for all of the light curve bins.&lt;br /&gt;
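Choosing the chunk boundaries can be sketched as follows; this is a hedged helper where the ~100-bin chunk size and the MJD values are example assumptions, not part of the pipeline:&lt;br /&gt;

```python
import math

# Split a long MJD range into chunks of at most ~100 weekly bins each.
# The MJD values and the 100-bin chunk size are example assumptions.
tmin, tmax = 55000.0, 58500.0   # MJD
timebin = 7.0                   # days, default weekly binning
max_bins = 100                  # bins per chunk

chunk_len = max_bins * timebin  # 700 days per chunk
nchunks = math.ceil((tmax - tmin) / chunk_len)
chunks = [(tmin + i * chunk_len, min(tmin + (i + 1) * chunk_len, tmax))
          for i in range(nchunks)]
print(nchunks)  # 5
```

Each (start, stop) pair can then be passed as TMIN and TMAX of a separate ''make_lightcurve.sl'' run before merging the results with ''ftmerge''.&lt;br /&gt;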
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus, who is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2223</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2223"/>
		<updated>2021-06-23T14:40:01Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the script, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is an extensive documentation available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, please find the information on the Fermi contact person at the Remeis observatory at the bottom of the page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package, called ''fermipy'', such that one can do the data analysis from within a python environment. There are analysis scripts, written with ''fermipy'', available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options to do so: using the existing conda environment, or creating the corresponding conda environment yourself. The first option is currently the recommended choice, also because the pipeline scripts are written for python 2.7; for the latter, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the version of this conda environment. In order to invoke it, it is necessary to source the conda installation, in which the environment for a Fermi-LAT analysis has been created, before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition in your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub and the full code, documentation and installation can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing the right version of each (as the switch from python2.7 to python3 happened between versions). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contains information about the events/photons measured by the LAT, and about the status of the spacecraft itself at each time. This information is the main input for any analysis; the corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, hence, it is not necessary to download a specific dataset. If you are using the provided analysis scripts, they automatically look through all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis on a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and both a start time and an end time, for which an average gamma-ray spectrum should be computed. There are several options you can choose from regarding the specifics of the analysis, which you can call via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script assumes that you are analysing a known gamma-ray source that is listed in the 4FGL. In that case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to provide the coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The computation of a light curve is similarly called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script takes the same arguments as the spectral analysis: the name of the source, and the start and end time of the interval for which the light curve should be computed. Again, several options are provided for specific analysis needs, which you can list via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take several hours up to days (if you want to compute many bins for your light curve). Ideally you want to let the cluster perform your jobs for you and not run an analysis on your desktop machine, unless you want to test something. In order to make life easy for a job submission on slurm, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' should look like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the name and the time ranges for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''', gives a full overview of all options available here:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, set only number in unit of Gigabyte; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailadress	If you want to get informed about the slurm status per email, put email here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. In particular, the memory and walltime options need to be increased in case you want to compute a light curve with more than 50 bins.&lt;br /&gt;
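To judge whether the default walltime of 10 hours is sufficient, a rough back-of-the-envelope estimate can help. Note that ''hours_per_bin'' below is a made-up average for illustration only; the real per-bin runtime varies strongly with source, ROI size, and time range:&lt;br /&gt;

```python
def lc_bins(tmin_mjd, tmax_mjd, timebin_days=7.0):
    """Number of light curve bins for a time range given in MJD."""
    return int((tmax_mjd - tmin_mjd) // timebin_days)

def walltime_hours(nbins, hours_per_bin=0.25):
    """Very rough walltime guess; hours_per_bin is an assumed average."""
    return nbins * hours_per_bin

n = lc_bins(58000, 58700)     # 700 days of weekly binning -> 100 bins
print(n, walltime_hours(n))   # ~25 h, i.e. more than the default 10 h
```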
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this is not possible to use via slurm at the moment. If you're running an analysis on your machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information on this, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
'''How can I submit an array job?''' - In order to do that, you need to work from within a bash shell, not a (t)csh shell. &lt;br /&gt;
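As a minimal sketch of such an array job (job name and resource values are placeholders, not an official Remeis template), a bash submission file could start like this; the ''%10'' throttle caps how many tasks run simultaneously, which also helps on shared file systems:&lt;br /&gt;

```shell
#!/bin/bash
# Minimal slurm array job sketch (names/values are placeholders).
# 100 tasks, but at most 10 running at the same time ("%10" throttle).
#SBATCH --job-name=fermi_lc_chunks
#SBATCH --array=0-99%10
#SBATCH --time=10:00:00
#SBATCH --mem-per-cpu=3G

echo "Processing chunk ${SLURM_ARRAY_TASK_ID}"
```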
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
During the Fermi data analysis, several errors can pop up and are not necessarily linked to the Fermi analysis itself. Here, I'll show those I encountered and what I have done to (hot)fix them.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
The internal sub-routine for creating the LCs threw an error a couple of times, stating that one of the LC bin folders was missing and, hence, the light curve could not be assembled into one FITS file. Because I couldn't pinpoint the error in the source code, my work-around for this issue is implemented in the wrapper scripts ''make_spectrum.sl'' and ''make_lightcurve.sl''. Deleting the subfolders is now handled there and not via the routine in the source code. &lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Individual bins crashed and then the light curve cannot be assembled''' &amp;lt;br/&amp;gt;&lt;br /&gt;
This is a problem that I fixed manually by editing the source code in the lightcurve.py file. While I have also [https://github.com/fermiPy/fermipy/issues/405 reported this issue to the developers], it has not been fixed yet (as of June 23, 2021). In order to still manage to create the light curve, I implemented the following hot fix. In case you are setting up your own conda environment and you come across an error related to ''KeyError: fit_success'', here is my hot fix (additions marked with &amp;amp;rArr;):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# this snippet is nearly at the bottom of the file&lt;br /&gt;
&lt;br /&gt;
        itimes = enumerate(zip(times[:-1], times[1:]))&lt;br /&gt;
          &amp;amp;rArr; flux_const_for_calcTS_var = None&lt;br /&gt;
        for i, time in itimes:&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if not 'fit_success' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; self.logger.error(&lt;br /&gt;
                  &amp;amp;rArr; 'fit_success not found in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                  &amp;amp;rArr; continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; [el]if not mapo[i]['fit_success']:&lt;br /&gt;
                self.logger.error(&lt;br /&gt;
                    'Fit failed in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if 'flux_const' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; flux_const_for_calcTS_var = mapo[i]['flux_const']&lt;br /&gt;
&lt;br /&gt;
            for k in o.keys():&lt;br /&gt;
&lt;br /&gt;
                if k == 'config':&lt;br /&gt;
                    continue&lt;br /&gt;
                if not k in mapo[i]:&lt;br /&gt;
                    continue&lt;br /&gt;
                # if (isinstance(o[k], np.ndarray) and&lt;br /&gt;
                #    o[k][i].shape != mapo[i][k].shape):&lt;br /&gt;
                #    gta.logger.warning('Incompatible shape for column %s', k)&lt;br /&gt;
                #    continue&lt;br /&gt;
&lt;br /&gt;
                try:&lt;br /&gt;
                    o[k][i] = mapo[i][k]&lt;br /&gt;
                except:&lt;br /&gt;
                    pass&lt;br /&gt;
&lt;br /&gt;
        systematic = kwargs.get('systematic', 0.02)&lt;br /&gt;
&lt;br /&gt;
        o['ts_var'] = calcTS_var(loglike=o['loglike_fixed'],&lt;br /&gt;
                                 loglike_const=o['loglike_const'],&lt;br /&gt;
                                 flux_err=o['flux_err_fixed'],&lt;br /&gt;
                                   &amp;amp;rArr; flux_const=flux_const_for_calcTS_var,&lt;br /&gt;
                                 systematic=systematic,&lt;br /&gt;
                                 fit_success=o['fit_success_fixed'])&lt;br /&gt;
&lt;br /&gt;
        return o&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
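The logic of the hot fix above, skipping bins without a successful fit instead of crashing on a missing key, can be illustrated with a standalone sketch (''bins'', ''fit_success'' and ''flux'' here are simplified stand-ins for fermipy's internal per-bin dictionaries):&lt;br /&gt;

```python
def collect_fluxes(bins):
    """Keep only bins that report a successful fit; record skipped bin indices."""
    fluxes, skipped = [], []
    for i, b in enumerate(bins):
        if 'fit_success' not in b:   # bin crashed before the fit finished
            skipped.append(i)
            continue
        if not b['fit_success']:     # fit ran but did not converge
            skipped.append(i)
            continue
        fluxes.append(b['flux'])
    return fluxes, skipped

bins = [{'fit_success': True, 'flux': 1.2},
        {},                               # crashed bin, no 'fit_success' key
        {'fit_success': False, 'flux': 0.0}]
print(collect_fluxes(bins))               # ([1.2], [1, 2])
```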
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
While /karl has the benefit of storing large amounts of data, there is an issue with computing many Fermi spectra or light curves at the same time and storing them on /karl. The problems that arise from this are not clearly visible in the log files. An error related to this can, for example, look like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GTBinnedAnalysis.run_gtapp(): Caught N3tip12TipExceptionE at the top level: setNumRecords could not insert rows in FITS table in extension &amp;quot;EVENTS&amp;quot; in file &amp;quot;/path_to_analysis/LC_analysis/ft1_00.fits&amp;quot; (CFITSIO ERROR 108: error reading from FITS file)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In case you need to compute a lot of spectra or light curves and want to do this on /karl, either limit the number of jobs running simultaneously to 10, or store the data on /userdata, where I have so far never encountered this problem.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
If you encounter a new error, you can also go ahead and start the python script for the analysis directly in the shell. Remember to load your conda environment first. You also need a ''config_SED.yaml'' or ''config_LC.yaml'' file in the folder where you are going to run the analysis, otherwise you won't get far. You can directly run the script, e.g. via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
(fermipy)[user@computer]&amp;gt; $FERMITOOLS/SEDcreator_fermipy.py&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve fails: too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
Computing a long light curve with short binning (e.g. daily binning for a year-long light curve, or weekly binning for a light curve covering a few years) can result in a termination of the analysis process when the computer cannot handle the number of open files anymore. I'm not entirely sure what goes wrong internally; however, my suggested hot fix is to split the computation of the light curve into manageable time intervals (e.g. chunks of ~100 bins). After all chunks have been computed, you can put them together using the command ''ftmerge'':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
&lt;br /&gt;
# Step 1: Create list of all light curve chunk files&lt;br /&gt;
ls -1 /path_to_my_analysis_results/source_name/LC_analysis/*_lightcurve.fits &amp;gt; /path_to_my_analysis_results/source_name.list&lt;br /&gt;
&lt;br /&gt;
# /path_to_my_analysis_results/ should contain the folders of each chunk, e.g. named MJD_start_end&lt;br /&gt;
&lt;br /&gt;
# Step 2: Merge all light curve files into one fits file&lt;br /&gt;
ftmerge @/path_to_my_analysis_results/source_name.list /path_to_my_analysis_results/lightcurve_source_name.fits clobber=yes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
No worries about losing already computed bins in case your analysis crashes: those are not deleted when the script crashes mid-way, so already computed LC bins are still available after the crash and it is not necessary to redo the analysis for all of the light curve bins.&lt;br /&gt;
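The chunking suggested above can be planned with a few lines of python (a sketch only; how you name the chunk folders, e.g. MJD_start_end, is up to you):&lt;br /&gt;

```python
def lc_chunks(tmin_mjd, tmax_mjd, timebin_days=7, bins_per_chunk=100):
    """Split [tmin, tmax] into chunks of at most bins_per_chunk light curve bins."""
    chunk_len = timebin_days * bins_per_chunk
    chunks, start = [], tmin_mjd
    while start < tmax_mjd:
        end = min(start + chunk_len, tmax_mjd)
        chunks.append((start, end))
        start = end
    return chunks

print(lc_chunks(55000, 56500, timebin_days=1))  # 1500 daily bins -> 15 chunks
```

Each (start, end) pair can then be passed as TMIN/TMAX of a separate analysis run.&lt;br /&gt;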
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2222</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2222"/>
		<updated>2021-06-23T14:15:10Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the script, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is an extensive documentation available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, please find the information on the Fermi contact person at the Remeis observatory at the bottom of the page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package, called ''fermipy'', such that one can do the data analysis from within a python environment. There are analysis scripts, written with ''fermipy'', available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options for the user: using an already existing conda environment, or creating the corresponding conda environment yourself. The former is currently the recommended choice, also because the pipeline scripts are written for python 2.7; for the latter, one needs to make sure that the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the versions in this conda environment. In order to invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition in your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub and the full code, documentation and installation can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing the right versions of each (in between versions there was a change from python2.7 to python3). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT, and about the status of the spacecraft itself at any given time. This information is needed for an analysis, as it is the main input. These files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, hence, it is not necessary to download a specific dataset. If you are using the provided analysis scripts, they automatically look through all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
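The photon file list is a plain text file with one FITS path per line, so its contents can be parsed with a few lines of python (a sketch; the file names in ''listing'' are made up for illustration):&lt;br /&gt;

```python
def parse_file_list(text):
    """Return the non-empty lines of a file list's contents."""
    return [line.strip() for line in text.splitlines() if line.strip()]

# Made-up example contents; on the Remeis servers you would read the real
# list from /satdata/X-ray/Fermi/photon_ft1_files.list instead.
listing = ("/satdata/X-ray/Fermi/weekly/w0001_ft1.fits\n"
           "/satdata/X-ray/Fermi/weekly/w0002_ft1.fits\n")
print(parse_file_list(listing))
```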
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the Galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis of a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and both a start time and an end time, for which an average gamma-ray spectrum should be computed. There are several options you can choose from regarding the specifics of the analysis, which you can call via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you do the analysis on a known gamma-ray source, which is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to add the coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
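Since the source name must not contain blanks, a catalog name like ''4FGL J1512.8-0906'' has to be rewritten before being passed to the script. A tiny helper illustrates the two obvious variants (which one the script expects, removal or an underscore, is not fixed here, so both are shown):&lt;br /&gt;

```python
def strip_blanks(name, replacement=''):
    """Remove (or replace) blanks in a catalog source name."""
    return name.replace(' ', replacement)

print(strip_blanks('4FGL J1512.8-0906'))        # '4FGLJ1512.8-0906'
print(strip_blanks('4FGL J1512.8-0906', '_'))   # '4FGL_J1512.8-0906'
```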
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The computation of a light curve is similarly called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and both a start time and an end time, for which the light curve should be computed. Again, several options are provided for specific needs of the analysis, which you can call via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take several hours up to days (if you want to compute many bins for your light curve). Ideally you want to let the cluster perform your jobs for you and not run an analysis on your desktop machine, unless you want to test something. In order to make life easy for a job submission on slurm, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' should look like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the name and the time ranges for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''', gives a full overview of all options available here:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, set only number in unit of Gigabyte; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailadress	If you want to get informed about the slurm status per email, put email here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. In particular, the memory and walltime options need to be increased in case you want to compute a light curve with more than 50 bins.&lt;br /&gt;
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this is not possible to use via slurm at the moment. If you're running an analysis on your machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information on this, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
'''How can I submit an array job?''' - In order to do that, you need to work from within a bash shell, not a (t)csh shell. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
During the Fermi data analysis, several errors can pop up and are not necessarily linked to the Fermi analysis itself. Here, I'll show those I encountered and what I have done to (hot)fix them.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
The internal sub-routine for creating the LCs threw an error a couple of times, stating that one of the LC bin folders was missing and, hence, the light curve could not be assembled into one FITS file. Because I couldn't pinpoint the error in the source code, my work-around for this issue is implemented in the wrapper scripts ''make_spectrum.sl'' and ''make_lightcurve.sl''. Deleting the subfolders is now handled there and not via the routine in the source code. &lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Individual bins crashed and then the light curve cannot be assembled''' &amp;lt;br/&amp;gt;&lt;br /&gt;
This is a problem that I fixed manually by editing the source code in the lightcurve.py file. While I have also [https://github.com/fermiPy/fermipy/issues/405 reported this issue to the developers], it has not been fixed yet (as of June 23, 2021). In order to still manage creating the light curve, I implemented the following hot fix. In case you are setting up your own conda environment and you come across an error related to ''KeyError: fit_success'', here is my hot fix (additions marked with &amp;amp;rArr;):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# this snippet is nearly at the bottom of the file&lt;br /&gt;
&lt;br /&gt;
        itimes = enumerate(zip(times[:-1], times[1:]))&lt;br /&gt;
          &amp;amp;rArr; flux_const_for_calcTS_var = None&lt;br /&gt;
        for i, time in itimes:&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if not 'fit_success' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; self.logger.error(&lt;br /&gt;
                  &amp;amp;rArr; 'fit_success not found in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                  &amp;amp;rArr; continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; elif not mapo[i]['fit_success']:&lt;br /&gt;
                self.logger.error(&lt;br /&gt;
                    'Fit failed in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if 'flux_const' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; flux_const_for_calcTS_var = mapo[i]['flux_const']&lt;br /&gt;
&lt;br /&gt;
            for k in o.keys():&lt;br /&gt;
&lt;br /&gt;
                if k == 'config':&lt;br /&gt;
                    continue&lt;br /&gt;
                if not k in mapo[i]:&lt;br /&gt;
                    continue&lt;br /&gt;
                # if (isinstance(o[k], np.ndarray) and&lt;br /&gt;
                #    o[k][i].shape != mapo[i][k].shape):&lt;br /&gt;
                #    gta.logger.warning('Incompatible shape for column %s', k)&lt;br /&gt;
                #    continue&lt;br /&gt;
&lt;br /&gt;
                try:&lt;br /&gt;
                    o[k][i] = mapo[i][k]&lt;br /&gt;
                except:&lt;br /&gt;
                    pass&lt;br /&gt;
&lt;br /&gt;
        systematic = kwargs.get('systematic', 0.02)&lt;br /&gt;
&lt;br /&gt;
        o['ts_var'] = calcTS_var(loglike=o['loglike_fixed'],&lt;br /&gt;
                                 loglike_const=o['loglike_const'],&lt;br /&gt;
                                 flux_err=o['flux_err_fixed'],&lt;br /&gt;
                                   &amp;amp;rArr; flux_const=flux_const_for_calcTS_var,&lt;br /&gt;
                                 systematic=systematic,&lt;br /&gt;
                                 fit_success=o['fit_success_fixed'])&lt;br /&gt;
&lt;br /&gt;
        return o&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
While /karl has the benefit of storing large amounts of data, there is an issue with computing many Fermi spectra or light curves at the same time and storing them on /karl. The resulting problems are not clearly visible from checking the log files. An error related to this can look, for example, like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GTBinnedAnalysis.run_gtapp(): Caught N3tip12TipExceptionE at the top level: setNumRecords could not insert rows in FITS table in extension &amp;quot;EVENTS&amp;quot; in file &amp;quot;/path_to_analysis/LC_analysis/ft1_00.fits&amp;quot; (CFITSIO ERROR 108: error reading from FITS file)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In case you need to compute a lot of spectra or light curves and want to do this on /karl, either limit the number of jobs running simultaneously to 10, or store the data on /userdata, where I have so far never encountered this problem.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve fails with too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
Computing a long light curve with short binning (e.g. daily binning for a year-long light curve, or weekly binning for a light curve covering a few years) can terminate the analysis process when the computer cannot handle the number of open files anymore. I'm not entirely sure what goes wrong internally; however, my suggested hot fix is to split the computation of the light curve into manageable time intervals (e.g. chunks of ~100 bins). After all chunks have been computed, you can put them together using the command ''ftmerge'':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
&lt;br /&gt;
# Step 1: Create list of all light curve chunk files&lt;br /&gt;
ls -1 /path_to_my_analysis_results/source_name/LC_analysis/*_lightcurve.fits &amp;gt; /path_to_my_analysis_results/source_name.list&lt;br /&gt;
&lt;br /&gt;
# /path_to_my_analysis_results/ should contain the folders of each chunk, e.g. named MJD_start_end&lt;br /&gt;
&lt;br /&gt;
# Step 2: Merge all light curve files into one fits file&lt;br /&gt;
ftmerge @/path_to_my_analysis_results/source_name.list /path_to_my_analysis_results/lightcurve_source_name.fits clobber=yes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
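The splitting itself can be scripted as well. A minimal bash sketch (the source name, MJD range, and chunk length are example values; the actual submission line is commented out) that walks over the full time range in chunks:&lt;br /&gt;

```shell
#!/bin/bash
# Example values; replace with your source and time range.
SRC=SOURCENAME
TMIN=58000        # start of the full light curve (MJD)
TMAX=58500        # end of the full light curve (MJD)
CHUNK=100         # ~100 daily bins per chunk
NCHUNKS=0
t=$TMIN
while [ "$t" -lt "$TMAX" ]; do
    t2=$(( t + CHUNK ))
    if [ "$t2" -gt "$TMAX" ]; then t2=$TMAX; fi   # clip the last chunk
    echo "chunk covering MJD $t - $t2"
    # $FERMITOOLS/submit_fermijob.sl LC $SRC $t $t2   # the actual submission
    NCHUNKS=$(( NCHUNKS + 1 ))
    t=$t2
done
```

Each chunk then ends up in its own folder (e.g. named by its MJD range), which is exactly the layout the ''ftmerge'' step above expects.&lt;br /&gt;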
No worries about losing already computed bins in case your analysis crashes: they are not deleted when the script crashes mid-way, so already computed LC bins remain available after the crash and it's not necessary to re-do the analysis for all of the light curve bins.&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2221</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2221"/>
		<updated>2021-06-23T14:12:57Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the scripts, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is extensive documentation available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, please find the information on the Fermi contact person at the Remeis observatory at the bottom of the page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package called ''fermipy'' that allows one to do the data analysis from within a python environment. There are analysis scripts, written with ''fermipy'', available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options to do so: using the already existing conda environment, or creating the corresponding conda environment yourself. The first option is currently the recommended choice, also because the pipeline scripts are written for python 2.7; for the latter, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the version of this conda environment. In order to invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition in your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub and the full code, documentation and installation can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing the right version of each (as between versions there was a switch from python 2.7 to python 3). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contains information about the events/photons measured by the LAT, as well as the status of the spacecraft itself at each time. This information is needed for an analysis, as it is the main input. These files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, hence it is not necessary to download a specific dataset. If you are using the provided analysis scripts, they automatically look through all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis on a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and both a start time and an end time, for which an average gamma-ray spectrum should be computed. There are several options you can choose from regarding the specifics of the analysis, which you can call via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you are analysing a known gamma-ray source, which is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to add the coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The script for computing a light curve is similarly named ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and both a start time and an end time, for which the light curve should be computed. Again, several options are provided for specific analysis needs, which you can list via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take several hours up to days (if you want to compute many bins for your light curve). Ideally you want to let the cluster perform your jobs for you and not run an analysis on your desktop machine, unless you want to test something. In order to make life easy for a job submission on slurm, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' should look like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the name and the time ranges for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''' gives a full overview of all options available here:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known = yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, set only number in unit of Gigabyte; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailadress	If you want to get informed about the slurm status per email, put email here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. In particular, the memory and walltime options need to be increased in case you want to compute a light curve with more than 50 bins.&lt;br /&gt;
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, multi-threading cannot be used via slurm at the moment. If you're running an analysis on your own machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
'''How can I submit an array job?''' - In order to do that, you need to work from within a bash shell, not a (t)csh shell. &lt;br /&gt;
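As a minimal sketch of such an array job (the start MJD, chunk length, and sbatch options below are example values, not part of the pipeline), each array task can derive its own time range from the task index and pass it to the pipeline script:&lt;br /&gt;

```shell
#!/bin/bash
#SBATCH --array=0-4              # five tasks, one light-curve chunk each
# Example values; adjust the start MJD and chunk length to your analysis.
START=58000                      # overall start of the light curve (MJD)
CHUNK=100                        # days covered by one array task
TASK=${SLURM_ARRAY_TASK_ID:-0}   # set by slurm; defaults to 0 outside slurm
TMIN=$(( START + TASK * CHUNK ))
TMAX=$(( TMIN + CHUNK ))
echo "task $TASK covers MJD $TMIN - $TMAX"
# $FERMITOOLS/make_lightcurve.sl SOURCENAME $TMIN $TMAX   # the actual pipeline call
```

Submitting this with ''sbatch'' from a bash shell starts one task per index; each task then computes its own MJD range independently.&lt;br /&gt;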
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
During the Fermi data analysis, several errors can pop up and are not necessarily linked to the Fermi analysis itself. Here, I'll show those I encountered and what I have done to (hot)fix them.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
The internal sub-routine for creating the LCs threw an error a couple of times, stating that one of the LC bin folders was missing, and hence the light curve could not be assembled into one fits file. Because I couldn't pinpoint the error in the source code, my work-around for this issue is implemented in the wrapper scripts ''make_spectrum.sl'' and ''make_lightcurve.sl''.&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Individual bins crashed and then the light curve cannot be assembled''' &amp;lt;br/&amp;gt;&lt;br /&gt;
This is a problem that I fixed manually by editing the source code in the lightcurve.py file. While I have also [https://github.com/fermiPy/fermipy/issues/405 reported this issue to the developers], it has not been fixed yet (as of June 23, 2021). In order to still manage creating the light curve, I implemented the following hot fix. In case you are setting up your own conda environment and you come across an error related to ''KeyError: fit_success'', here is my hot fix (additions marked with &amp;amp;rArr;):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# this snippet is nearly at the bottom of the file&lt;br /&gt;
&lt;br /&gt;
        itimes = enumerate(zip(times[:-1], times[1:]))&lt;br /&gt;
          &amp;amp;rArr; flux_const_for_calcTS_var = None&lt;br /&gt;
        for i, time in itimes:&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if not 'fit_success' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; self.logger.error(&lt;br /&gt;
                  &amp;amp;rArr; 'fit_success not found in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                  &amp;amp;rArr; continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; elif not mapo[i]['fit_success']:&lt;br /&gt;
                self.logger.error(&lt;br /&gt;
                    'Fit failed in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if 'flux_const' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; flux_const_for_calcTS_var = mapo[i]['flux_const']&lt;br /&gt;
&lt;br /&gt;
            for k in o.keys():&lt;br /&gt;
&lt;br /&gt;
                if k == 'config':&lt;br /&gt;
                    continue&lt;br /&gt;
                if not k in mapo[i]:&lt;br /&gt;
                    continue&lt;br /&gt;
                # if (isinstance(o[k], np.ndarray) and&lt;br /&gt;
                #    o[k][i].shape != mapo[i][k].shape):&lt;br /&gt;
                #    gta.logger.warning('Incompatible shape for column %s', k)&lt;br /&gt;
                #    continue&lt;br /&gt;
&lt;br /&gt;
                try:&lt;br /&gt;
                    o[k][i] = mapo[i][k]&lt;br /&gt;
                except:&lt;br /&gt;
                    pass&lt;br /&gt;
&lt;br /&gt;
        systematic = kwargs.get('systematic', 0.02)&lt;br /&gt;
&lt;br /&gt;
        o['ts_var'] = calcTS_var(loglike=o['loglike_fixed'],&lt;br /&gt;
                                 loglike_const=o['loglike_const'],&lt;br /&gt;
                                 flux_err=o['flux_err_fixed'],&lt;br /&gt;
                                   &amp;amp;rArr; flux_const=flux_const_for_calcTS_var,&lt;br /&gt;
                                 systematic=systematic,&lt;br /&gt;
                                 fit_success=o['fit_success_fixed'])&lt;br /&gt;
&lt;br /&gt;
        return o&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
While /karl has the benefit of storing large amounts of data, there is an issue with computing many Fermi spectra or light curves at the same time and storing them on /karl. The resulting problems are not clearly visible from checking the log files. An error related to this can look, for example, like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GTBinnedAnalysis.run_gtapp(): Caught N3tip12TipExceptionE at the top level: setNumRecords could not insert rows in FITS table in extension &amp;quot;EVENTS&amp;quot; in file &amp;quot;/path_to_analysis/LC_analysis/ft1_00.fits&amp;quot; (CFITSIO ERROR 108: error reading from FITS file)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In case you need to compute a lot of spectra or light curves and want to do this on /karl, either limit the number of jobs running simultaneously to 10, or store the data on /userdata, where I have so far never encountered this problem.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve fails with too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
Computing a long light curve with short binning (e.g. daily binning for a year-long light curve, or weekly binning for a light curve covering a few years) can terminate the analysis process when the computer cannot handle the number of open files anymore. I'm not entirely sure what goes wrong internally; however, my suggested hot fix is to split the computation of the light curve into manageable time intervals (e.g. chunks of ~100 bins). After all chunks have been computed, you can put them together using the command ''ftmerge'':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
&lt;br /&gt;
# Step 1: Create list of all light curve chunk files&lt;br /&gt;
ls -1 /path_to_my_analysis_results/source_name/LC_analysis/*_lightcurve.fits &amp;gt; /path_to_my_analysis_results/source_name.list&lt;br /&gt;
&lt;br /&gt;
# /path_to_my_analysis_results/ should contain the folders of each chunk, e.g. named MJD_start_end&lt;br /&gt;
&lt;br /&gt;
# Step 2: Merge all light curve files into one fits file&lt;br /&gt;
ftmerge @/path_to_my_analysis_results/source_name.list /path_to_my_analysis_results/lightcurve_source_name.fits clobber=yes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
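The splitting itself can be scripted as well. A minimal bash sketch (the source name, MJD range, and chunk length are example values; the actual submission line is commented out) that walks over the full time range in chunks:&lt;br /&gt;

```shell
#!/bin/bash
# Example values; replace with your source and time range.
SRC=SOURCENAME
TMIN=58000        # start of the full light curve (MJD)
TMAX=58500        # end of the full light curve (MJD)
CHUNK=100         # ~100 daily bins per chunk
NCHUNKS=0
t=$TMIN
while [ "$t" -lt "$TMAX" ]; do
    t2=$(( t + CHUNK ))
    if [ "$t2" -gt "$TMAX" ]; then t2=$TMAX; fi   # clip the last chunk
    echo "chunk covering MJD $t - $t2"
    # $FERMITOOLS/submit_fermijob.sl LC $SRC $t $t2   # the actual submission
    NCHUNKS=$(( NCHUNKS + 1 ))
    t=$t2
done
```

Each chunk then ends up in its own folder (e.g. named by its MJD range), which is exactly the layout the ''ftmerge'' step above expects.&lt;br /&gt;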
No worries about losing already computed bins in case your analysis crashes: they are not deleted when the script crashes mid-way, so already computed LC bins remain available after the crash and it's not necessary to re-do the analysis for all of the light curve bins.&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2220</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2220"/>
		<updated>2021-06-23T13:02:48Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day; hence, gamma-ray data is available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the script, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is extensive documentation available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, please find the information on the Fermi contact person at the Remeis observatory at the bottom of the page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a Python package called ''fermipy'', which allows one to do the data analysis from within a Python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options: activating an existing conda environment, or creating the corresponding conda environment yourself. The first option is currently the recommended choice, not least because the pipeline scripts are written for Python 2.7; for the second option, you need to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with this version of the conda environment. To invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition to your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
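To verify that the environment works, you can activate it and check the installed ''fermipy'' version (0.20.0 is the version stated in the section heading; the check itself is merely a suggestion):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
fermipy&lt;br /&gt;
python -c &amp;quot;import fermipy; print(fermipy.__version__)&amp;quot;  # should print 0.20.0&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;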
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub; the full code, documentation, and installation instructions can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing the right version of each package (between versions there was a switch from Python 2.7 to Python 3). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
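As a rough sketch only (the exact conda channels may differ; please follow the linked installation instructions), creating a matching environment could look like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
# Fermitools 1.2.23 still requires python 2.7&lt;br /&gt;
conda create --name fermi -c conda-forge -c fermi fermitools=1.2.23 python=2.7&lt;br /&gt;
conda activate fermi&lt;br /&gt;
# fermipy 0.20.0 is the version the pipeline scripts are written for&lt;br /&gt;
pip install fermipy==0.20.0&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;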
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT, as well as information about the status of the spacecraft itself at each time. This information is the main input of an analysis. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, hence it is not necessary to download a specific dataset. If you are using the provided analysis scripts, they automatically search through all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis on a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and both a start time and an end time, for which an average gamma-ray spectrum should be computed. There are several options regarding the specifics of the analysis, which you can list via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you are analysing a known gamma-ray source, which is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to add the coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
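For example (the source name, coordinates, and MJD range below are placeholders for illustration):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
# Known source: pass the 4FGL name without blanks&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl 4FGLJ0319.8+4130 58484 58849&lt;br /&gt;
&lt;br /&gt;
# Source not in the 4FGL: mark it as unknown and give its coordinates in degrees&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --known=no --ra=49.95 --dec=41.51 MySource 58484 58849&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;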
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The computation of a light curve is similarly called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and both a start time and an end time, for which the light curve should be computed. Again, several options are provided for specific analysis needs, which you can list via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
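For instance, a daily-binned light curve that also keeps the individual bin results could be requested like this (source name and MJD range are placeholders):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --timebin=1 --savebins=True 4FGLJ0319.8+4130 58600 58690&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;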
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take several hours up to days (if you want to compute many bins for your light curve). Ideally, you let the cluster perform your jobs for you instead of running an analysis on your desktop machine, unless you want to test something. To simplify job submission on slurm, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. It is used as follows:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the name and the time ranges for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''' gives a full overview of all options available here:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known = yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, set only number in unit of Gigabyte; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailadress	If you want to get informed about the slurm status per email, put email here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. In particular, the memory and walltime options need to be increased in case you want to compute a light curve with more than 50 bins.&lt;br /&gt;
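A submission for a longer light curve with increased resources could then look like this (source name, time range, and email address are placeholders):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
# Weekly-binned light curve over several years: allow 48 h walltime and 6 GB per CPU&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl --walltime=48 --mem=6 --email=jane.doe@fau.de LC 4FGLJ0319.8+4130 56000 59000&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;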
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this is not possible to use via slurm at the moment. If you're running an analysis on your machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information on this, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
'''How can I submit an array job?''' - In order to do that, you need to work from within a bash shell, not a (t)csh shell. &lt;br /&gt;
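A minimal sketch of such an array job (chunk length, time range, and resource values are purely illustrative and untested) could compute one light curve chunk per array task:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --job-name=fermi_lc&lt;br /&gt;
#SBATCH --array=0-9            # 10 chunks of 100 days each&lt;br /&gt;
#SBATCH --time=10:00:00&lt;br /&gt;
#SBATCH --mem-per-cpu=3G&lt;br /&gt;
&lt;br /&gt;
TMIN=$(( 58000 + SLURM_ARRAY_TASK_ID * 100 ))&lt;br /&gt;
TMAX=$(( TMIN + 100 ))&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --expath=chunk_${TMIN}_${TMAX} SOURCENAME $TMIN $TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;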
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
During the Fermi data analysis, several errors can pop up that are not necessarily linked to the Fermi analysis itself. Here, I'll show those I encountered and what I have done to (hot)fix them.&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
The internal sub-routine for creating the LCs threw an error a couple of times.&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Individual bins crashed and then the light curve cannot be assembled''' &amp;lt;br/&amp;gt;&lt;br /&gt;
This is a problem that I fixed manually by editing the source code in the lightcurve.py file. While I have also [https://github.com/fermiPy/fermipy/issues/405 reported this issue to the developers], it has not been fixed yet (as of June 23, 2021). In order to still manage to create the light curve, I implemented the following hot fix. In case you are setting up your own conda environment and you come across an error related to ''KeyError: fit_success'', here is my hot fix (additions marked with &amp;amp;rArr;):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# this snippet is nearly at the bottom of the file&lt;br /&gt;
&lt;br /&gt;
        itimes = enumerate(zip(times[:-1], times[1:]))&lt;br /&gt;
          &amp;amp;rArr; flux_const_for_calcTS_var = None&lt;br /&gt;
        for i, time in itimes:&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if not 'fit_success' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; self.logger.error(&lt;br /&gt;
                  &amp;amp;rArr; 'fit_success not found in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                  &amp;amp;rArr; continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; elif not mapo[i]['fit_success']:&lt;br /&gt;
                self.logger.error(&lt;br /&gt;
                    'Fit failed in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if 'flux_const' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; flux_const_for_calcTS_var = mapo[i]['flux_const']&lt;br /&gt;
&lt;br /&gt;
            for k in o.keys():&lt;br /&gt;
&lt;br /&gt;
                if k == 'config':&lt;br /&gt;
                    continue&lt;br /&gt;
                if not k in mapo[i]:&lt;br /&gt;
                    continue&lt;br /&gt;
                # if (isinstance(o[k], np.ndarray) and&lt;br /&gt;
                #    o[k][i].shape != mapo[i][k].shape):&lt;br /&gt;
                #    gta.logger.warning('Incompatible shape for column %s', k)&lt;br /&gt;
                #    continue&lt;br /&gt;
&lt;br /&gt;
                try:&lt;br /&gt;
                    o[k][i] = mapo[i][k]&lt;br /&gt;
                except:&lt;br /&gt;
                    pass&lt;br /&gt;
&lt;br /&gt;
        systematic = kwargs.get('systematic', 0.02)&lt;br /&gt;
&lt;br /&gt;
        o['ts_var'] = calcTS_var(loglike=o['loglike_fixed'],&lt;br /&gt;
                                 loglike_const=o['loglike_const'],&lt;br /&gt;
                                 flux_err=o['flux_err_fixed'],&lt;br /&gt;
                                   &amp;amp;rArr; flux_const=flux_const_for_calcTS_var,&lt;br /&gt;
                                 systematic=systematic,&lt;br /&gt;
                                 fit_success=o['fit_success_fixed'])&lt;br /&gt;
&lt;br /&gt;
        return o&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
While /karl has the benefit of storing large amounts of data, there is an issue with computing many Fermi spectra or light curves at the same time while storing the results on /karl. The resulting problems are not clearly visible from checking the log files. An error related to this can, for example, look like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GTBinnedAnalysis.run_gtapp(): Caught N3tip12TipExceptionE at the top level: setNumRecords could not insert rows in FITS table in extension &amp;quot;EVENTS&amp;quot; in file &amp;quot;/path_to_analysis/LC_analysis/ft1_00.fits&amp;quot; (CFITSIO ERROR 108: error reading from FITS file)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In case you need to compute a lot of spectra or light curves and want to do this on /karl, either limit the number of jobs running simultaneously to 10, or store the data on /userdata, where I have so far never encountered this problem.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve fails with too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
Computing a long light curve with short binning (e.g. daily binning for a year-long light curve, or weekly binning for a light curve covering a few years) can terminate the analysis process once the computer cannot handle the number of open files anymore. I'm not entirely sure what goes wrong internally; my suggested hot fix is to split the computation of the light curve into manageable time intervals (e.g. chunks of ~100 bins). After all chunks have been computed, you can put them together using the command ''ftmerge'':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
&lt;br /&gt;
# Step 1: Create list of all light curve chunk files&lt;br /&gt;
ls -1 /path_to_my_analysis_results/source_name/LC_analysis/*_lightcurve.fits &amp;gt; /path_to_my_analysis_results/source_name.list&lt;br /&gt;
&lt;br /&gt;
# /path_to_my_analysis_results/ should contain the folders of each chunk, e.g. named MJD_start_end&lt;br /&gt;
&lt;br /&gt;
# Step 2: Merge all light curve files into one fits file&lt;br /&gt;
ftmerge @/path_to_my_analysis_results/source_name.list /path_to_my_analysis_results/lightcurve_source_name.fits clobber=yes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Do not worry about losing already computed bins if your analysis crashes: they are not deleted when the script crashes mid-way, so the finished LC bins are still available afterwards and it is not necessary to re-do the analysis for all of the light curve bins.&lt;br /&gt;
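To produce such chunks in the first place, a simple loop can submit one job per ~100-day interval (time range and source name below are placeholders):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
set tmin = 58000&lt;br /&gt;
while ( $tmin &amp;lt; 59000 )&lt;br /&gt;
    @ tmax = $tmin + 100&lt;br /&gt;
    $FERMITOOLS/submit_fermijob.sl LC source_name $tmin $tmax&lt;br /&gt;
    @ tmin = $tmax&lt;br /&gt;
end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;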
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2219</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2219"/>
		<updated>2021-06-23T10:36:11Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day; hence, gamma-ray data is available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the script, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is extensive documentation available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, please find the information on the Fermi contact person at the Remeis observatory at the bottom of the page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a Python package called ''fermipy'', which allows one to do the data analysis from within a Python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options: activating an existing conda environment, or creating the corresponding conda environment yourself. The first option is currently the recommended choice, not least because the pipeline scripts are written for Python 2.7; for the second option, you need to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with this version of the conda environment. To invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition to your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub; the full code, documentation, and installation instructions can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing the right version of each package (between versions there was a switch from Python 2.7 to Python 3). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT, as well as information about the status of the spacecraft itself at each time. This information is the main input of an analysis. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, hence it is not necessary to download a specific dataset. If you are using the provided analysis scripts, they automatically search through all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
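For illustration, here is a minimal sketch of how such a photon file list could be read in, assuming it simply contains one FT1 file path per line (the helper function and the example file names are hypothetical, not part of the pipeline scripts):

```python
from pathlib import Path
import os
import tempfile

def read_photon_file_list(list_path):
    """Return the FT1 file paths listed in a photon file list.

    Assumes one path per line; blank lines and '#' comments are skipped.
    """
    lines = Path(list_path).read_text().splitlines()
    return [ln.strip() for ln in lines if ln.strip() and not ln.startswith('#')]

# Example with a temporary stand-in for /satdata/X-ray/Fermi/photon_ft1_files.list
# (the weekly file names below are made up for the example):
with tempfile.NamedTemporaryFile('w', suffix='.list', delete=False) as f:
    f.write("/satdata/X-ray/Fermi/weekly/lat_photon_weekly_w009_p305_v001.fits\n")
    f.write("/satdata/X-ray/Fermi/weekly/lat_photon_weekly_w010_p305_v001.fits\n")
    tmp = f.name
ft1_files = read_photon_file_list(tmp)
os.remove(tmp)
print(len(ft1_files))
```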
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly located in the Galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. These are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis, as the background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis on a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and both a start time and an end time, for which an average gamma-ray spectrum should be computed. There are several options you can choose from regarding the specifics of the analysis, which you can call via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you analyse a known gamma-ray source, which is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to add the coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
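Note that the scripts take tmin and tmax in MJD, while the Fermitools internally work with Mission Elapsed Time (MET, seconds since the Fermi epoch 2001-01-01 00:00:00 UTC, which is MJD 51910). A minimal sketch of the conversion, ignoring leap seconds:

```python
# Convert between MJD (used by the pipeline scripts) and Fermi Mission
# Elapsed Time (MET, seconds since 2001-01-01 00:00:00 UTC = MJD 51910).
# Sketch only: leap seconds are ignored here.
MJD_REF = 51910.0          # MJD of the Fermi MET epoch
SECONDS_PER_DAY = 86400.0

def mjd_to_met(mjd):
    return (mjd - MJD_REF) * SECONDS_PER_DAY

def met_to_mjd(met):
    return MJD_REF + met / SECONDS_PER_DAY

# E.g. a one-week spectrum starting at MJD 59000:
tmin, tmax = 59000.0, 59007.0
print(mjd_to_met(tmin), mjd_to_met(tmax))
```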
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The script for computing a light curve is analogously called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and both a start time and an end time, for which the light curve should be computed. Again, several options are provided for specific analysis needs, which you can list via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
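The --emin/--emax/--ebin defaults (100 MeV to 300000 MeV, 10 bins per decade) imply a logarithmic energy grid spanning about 3.5 decades. The following is a sketch of how such a grid can be constructed; it illustrates the binning scheme and is not the pipeline code itself:

```python
import math

def energy_bin_edges(emin, emax, bins_per_decade):
    """Logarithmically spaced energy bin edges (in MeV), as implied by the
    --emin/--emax/--ebin options. Sketch of the binning scheme only."""
    n_decades = math.log10(emax / emin)
    n_bins = int(round(n_decades * bins_per_decade))
    step = n_decades / n_bins
    return [emin * 10 ** (i * step) for i in range(n_bins + 1)]

# Defaults: 100 MeV to 300 GeV with 10 bins per decade:
edges = energy_bin_edges(100.0, 300000.0, 10)
print(len(edges) - 1)  # number of energy bins
```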
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take several hours up to days (if you want to compute many bins for your light curve). Ideally, you let the cluster perform your jobs instead of running an analysis on your desktop machine, unless you want to test something. To make submitting a job to slurm easy, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' looks like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the name and the time ranges for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''' gives a full overview of all available options:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known = yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, set only number in unit of Gigabyte; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailadress	If you want to get informed about the slurm status per email, put email here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. In particular, the memory and walltime options need to be increased in case you want to compute a light curve with more than 50 bins.&lt;br /&gt;
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this is currently not possible via slurm. If you're running an analysis on your own machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
'''How can I submit an array job?''' - In order to do that, you need to work from within a bash shell, not a (t)csh shell. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
During the Fermi data analysis, several errors can pop up that are not necessarily linked to the Fermi analysis itself. Here, I'll show those I encountered and what I have done to (hot)fix them.&amp;lt;br/&amp;gt;&lt;br /&gt;
'''error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Individual bins crashed and then the light curve cannot be assembled''' &amp;lt;br/&amp;gt;&lt;br /&gt;
This is a problem that I fixed manually by editing the source code in the lightcurve.py file. While I have also [https://github.com/fermiPy/fermipy/issues/405 reported this issue to the developers], it has not been fixed yet (as of June 23, 2021). In order to still manage to create the light curve, I implemented the following hot fix. In case you are setting up your own conda environment and you come across an error related to ''KeyError: fit_success'', here is my hot fix (additions marked with &amp;amp;rArr;):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# this snippet is nearly at the bottom of the file&lt;br /&gt;
&lt;br /&gt;
        itimes = enumerate(zip(times[:-1], times[1:]))&lt;br /&gt;
          &amp;amp;rArr; flux_const_for_calcTS_var = None&lt;br /&gt;
        for i, time in itimes:&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if not 'fit_success' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; self.logger.error(&lt;br /&gt;
                  &amp;amp;rArr; 'fit_success not found in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                  &amp;amp;rArr; continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; [el]if not mapo[i]['fit_success']:&lt;br /&gt;
                self.logger.error(&lt;br /&gt;
                    'Fit failed in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                continue&lt;br /&gt;
&lt;br /&gt;
              &amp;amp;rArr; if 'flux_const' in mapo[i]:&lt;br /&gt;
                  &amp;amp;rArr; flux_const_for_calcTS_var = mapo[i]['flux_const']&lt;br /&gt;
&lt;br /&gt;
            for k in o.keys():&lt;br /&gt;
&lt;br /&gt;
                if k == 'config':&lt;br /&gt;
                    continue&lt;br /&gt;
                if not k in mapo[i]:&lt;br /&gt;
                    continue&lt;br /&gt;
                # if (isinstance(o[k], np.ndarray) and&lt;br /&gt;
                #    o[k][i].shape != mapo[i][k].shape):&lt;br /&gt;
                #    gta.logger.warning('Incompatible shape for column %s', k)&lt;br /&gt;
                #    continue&lt;br /&gt;
&lt;br /&gt;
                try:&lt;br /&gt;
                    o[k][i] = mapo[i][k]&lt;br /&gt;
                except:&lt;br /&gt;
                    pass&lt;br /&gt;
&lt;br /&gt;
        systematic = kwargs.get('systematic', 0.02)&lt;br /&gt;
&lt;br /&gt;
        o['ts_var'] = calcTS_var(loglike=o['loglike_fixed'],&lt;br /&gt;
                                 loglike_const=o['loglike_const'],&lt;br /&gt;
                                 flux_err=o['flux_err_fixed'],&lt;br /&gt;
                                   &amp;amp;rArr; flux_const=flux_const_for_calcTS_var,&lt;br /&gt;
                                 systematic=systematic,&lt;br /&gt;
                                 fit_success=o['fit_success_fixed'])&lt;br /&gt;
&lt;br /&gt;
        return o&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
While /karl has the benefit of storing large amounts of data, there is an issue with computing many Fermi spectra or light curves at the same time while storing the output on /karl. The problems that arise from this are not clearly visible in the log files. An error related to this can, for example, look like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GTBinnedAnalysis.run_gtapp(): Caught N3tip12TipExceptionE at the top level: setNumRecords could not insert rows in FITS table in extension &amp;quot;EVENTS&amp;quot; in file &amp;quot;/path_to_analysis/LC_analysis/ft1_00.fits&amp;quot; (CFITSIO ERROR 108: error reading from FITS file)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In case you need to compute a lot of spectra or light curves and want to do this on /karl, either limit the number of jobs running simultaneously to 10, or store the data on /userdata, where I have so far never encountered this problem.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve fails too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
Computing a long light curve with short time bins (e.g. daily binning for a year-long light curve, or weekly binning for a light curve covering a few years) can result in a termination of the analysis process when the computer cannot handle the number of open files anymore. I'm not entirely sure what goes wrong internally; however, my suggested hot fix is to split the computation of the light curve into manageable time intervals (e.g. chunks of ~100 bins). After all chunks have been computed, you can put them together using the command ''ftmerge'':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
&lt;br /&gt;
# Step 1: Create list of all light curve chunk files&lt;br /&gt;
ls -1 /path_to_my_analysis_results/source_name/LC_analysis/*_lightcurve.fits &amp;gt; /path_to_my_analysis_results/source_name.list&lt;br /&gt;
&lt;br /&gt;
# /path_to_my_analysis_results/ should contain the folders of each chunk, e.g. named MJD_start_end&lt;br /&gt;
&lt;br /&gt;
# Step 2: Merge all light curve files into one fits file&lt;br /&gt;
ftmerge @/path_to_my_analysis_results/source_name.list /path_to_my_analysis_results/lightcurve_source_name.fits clobber=yes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
No worries about losing already computed bins in case your analysis crashes: these are not deleted when the script crashes mid-way, so already computed LC bins are still available after the crash and it is not necessary to redo the analysis for all of the light curve bins.&lt;br /&gt;
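The splitting into chunks of roughly 100 bins can be prepared with a few lines of Python. The helper below is a hypothetical sketch (not part of the pipeline scripts), assuming tmin/tmax are given in MJD and the bin length in days, matching the script arguments above:

```python
def split_into_chunks(tmin, tmax, timebin=7.0, max_bins=100):
    """Split the range [tmin, tmax] (in MJD) into chunks containing at most
    max_bins light curve bins of length timebin days each.
    Hypothetical helper for setting up chunked make_lightcurve.sl runs."""
    chunks = []
    start = tmin
    while start < tmax:
        end = min(start + max_bins * timebin, tmax)
        chunks.append((start, end))
        start = end
    return chunks

# Daily binning over 1100 days -> eleven 100-day chunks:
chunks = split_into_chunks(58000.0, 59100.0, timebin=1.0, max_bins=100)
print(len(chunks), chunks[0], chunks[-1])
```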
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2218</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2218"/>
		<updated>2021-06-23T10:26:34Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the script, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is an extensive documentation available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, please find the information on the Fermi contact person at the Remeis observatory at the bottom of the page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package, called ''fermipy'', such that one can do the data analysis from within a python environment. There are analysis scripts, written with ''fermipy'', available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options to do so: using an existing conda environment, or creating the corresponding conda environment yourself. The first option is currently the recommended choice, also because the pipeline scripts are written for Python 2.7; for the latter, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the versions in this conda environment. In order to invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition in your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now switch into the conda environment in which the necessary tools for the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub and the full code, documentation and installation can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you install matching versions of both (the switch from Python 2.7 to Python 3 happened between versions). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT, and about the status of the spacecraft itself at each point in time. This information is needed for an analysis, as it is the main input. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, hence it is not necessary to download a specific dataset. The provided analysis scripts automatically search through all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly located in the Galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. These are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis, as the background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis on a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and both a start time and an end time, for which an average gamma-ray spectrum should be computed. There are several options you can choose from regarding the specifics of the analysis, which you can call via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you analyse a known gamma-ray source, which is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to add the coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The script for computing a light curve is analogously called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and both a start time and an end time, for which the light curve should be computed. Again, several options are provided for specific analysis needs, which you can list via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which the light curve will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
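Before launching a long run it can help to estimate how many bins a given time range will produce, since the number of bins drives the runtime (and the walltime needed on slurm later). A plain shell-arithmetic sketch with made-up numbers:

```shell
# Illustrative only: 700 days of data with the default weekly binning.
tmin=58000   # MJD
tmax=58700   # MJD
timebin=7    # days per bin (the script default)

# ceiling division: number of light curve bins
nbins=$(( (tmax - tmin + timebin - 1) / timebin ))
echo "$nbins"   # -> 100
```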
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take several hours up to days (if you want to compute many bins for your light curve). Ideally you want to let the cluster perform your jobs for you and not run an analysis on your desktop machine, unless you want to test something. In order to make life easy for a job submission on slurm, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' should look like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the name and the time ranges for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''' gives a full overview of all available options:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, set only number in unit of Gigabyte; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailadress	If you want to get informed about the slurm status per email, put email here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. In particular, the memory and walltime options need to be increased if you want to compute a light curve with more than 50 bins.&lt;br /&gt;
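As an illustration, a daily-binned light curve covering several years (the source name, time range, and resource numbers below are made up) could be submitted with increased resources; the options themselves are taken from the help output above. The command is only echoed here, not executed:

```shell
src=4FGLJ1104.4+3812   # example 4FGL name, written without blanks

# Build the submission command (echoed instead of executed):
cmd="\$FERMITOOLS/submit_fermijob.sl --timebin=1 --walltime=48 --mem=6 LC $src 58000 59400"
echo "$cmd"
```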
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this is not possible to use via slurm at the moment. If you're running an analysis on your machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information on this, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
'''How can I submit an array job?''' - In order to do that, you need to work from within a bash shell, not a (t)csh shell. &lt;br /&gt;
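A minimal sketch of what such an array job file could look like, written from bash (the chunk length, resource numbers, and source name are assumptions, not pipeline defaults; each array task would compute one time chunk):

```shell
#!/bin/bash
# Write a hypothetical slurm array job file; submit it later with sbatch.
cat > fermi_array.sbatch <<'EOF'
#!/bin/bash
#SBATCH --array=0-9
#SBATCH --time=10:00:00
#SBATCH --mem-per-cpu=3G

# Each task covers one 70-day chunk (illustrative chunking scheme).
tmin=$(( 59000 + SLURM_ARRAY_TASK_ID * 70 ))
tmax=$(( tmin + 70 ))
$FERMITOOLS/make_lightcurve.sl MYSOURCE $tmin $tmax
EOF

# submit with: sbatch fermi_array.sbatch
```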
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
During the Fermi data analysis, several errors can pop up and are not necessarily linked to the Fermi analysis itself. Here, I'll show those I encountered and what I have done to (hot)fix them.&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Individual bins crashed and then the light curve cannot be assembled''' &amp;lt;br/&amp;gt;&lt;br /&gt;
This is a problem that I fixed manually by editing the source code in the lightcurve.py file. While I have also [https://github.com/fermiPy/fermipy/issues/405 reported this issue to the developers], it has not been fixed yet (as of June 23, 2021). In case you are setting up your own conda environment and come across an error related to ''KeyError: fit_success'', the following hot fix still allows the light curve to be assembled:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# this snippet is nearly at the bottom of the file&lt;br /&gt;
&lt;br /&gt;
        itimes = enumerate(zip(times[:-1], times[1:]))&amp;lt;/pre&amp;gt;&lt;br /&gt;
        &amp;lt;span style=&amp;quot;color:red;&amp;quot;&amp;gt;flux_const_for_calcTS_var = None&amp;lt;/span&amp;gt;&lt;br /&gt;
        &amp;lt;pre&amp;gt;for i, time in itimes:&lt;br /&gt;
&lt;br /&gt;
            '''if not 'fit_success' in mapo[i]:&lt;br /&gt;
                self.logger.error(&lt;br /&gt;
                'fit_success not found in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                continue&lt;br /&gt;
&lt;br /&gt;
            el'''if not mapo[i]['fit_success']:&lt;br /&gt;
                self.logger.error(&lt;br /&gt;
                    'Fit failed in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                continue&lt;br /&gt;
&lt;br /&gt;
            for k in o.keys():&lt;br /&gt;
&lt;br /&gt;
                if k == 'config':&lt;br /&gt;
                    continue&lt;br /&gt;
                if not k in mapo[i]:&lt;br /&gt;
                    continue&lt;br /&gt;
                # if (isinstance(o[k], np.ndarray) and&lt;br /&gt;
                #    o[k][i].shape != mapo[i][k].shape):&lt;br /&gt;
                #    gta.logger.warning('Incompatible shape for column %s', k)&lt;br /&gt;
                #    continue&lt;br /&gt;
&lt;br /&gt;
                try:&lt;br /&gt;
                    o[k][i] = mapo[i][k]&lt;br /&gt;
                except:&lt;br /&gt;
                    pass&lt;br /&gt;
&lt;br /&gt;
        systematic = kwargs.get('systematic', 0.02)&lt;br /&gt;
&lt;br /&gt;
        o['ts_var'] = calcTS_var(loglike=o['loglike_fixed'],&lt;br /&gt;
                                 loglike_const=o['loglike_const'],&lt;br /&gt;
                                 flux_err=o['flux_err_fixed'],&lt;br /&gt;
                                 flux_const=mapo[0]['flux_const'],&lt;br /&gt;
                                 systematic=systematic,&lt;br /&gt;
                                 fit_success=o['fit_success_fixed'])&lt;br /&gt;
&lt;br /&gt;
        return o&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
While /karl has the benefit of storing large amounts of data, there is an issue with computing many Fermi spectra or light curves at the same time while storing the results on /karl. The problems that arise from this are not clearly visible in the log files. An error related to this can, for example, look like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GTBinnedAnalysis.run_gtapp(): Caught N3tip12TipExceptionE at the top level: setNumRecords could not insert rows in FITS table in extension &amp;quot;EVENTS&amp;quot; in file &amp;quot;/path_to_analysis/LC_analysis/ft1_00.fits&amp;quot; (CFITSIO ERROR 108: error reading from FITS file)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In case you need to compute a lot of spectra or light curves and want to do this on /karl, either limit the number of jobs running simultaneously to 10, or store the data on /userdata, where I have so far never encountered this problem.&lt;br /&gt;
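One way to enforce such a job limit without babysitting the queue is slurm's built-in array throttle: a ''%N'' suffix on the array range caps the number of simultaneously running tasks. A minimal sketch (the file name is arbitrary):

```shell
# "%10" after the index range is standard sbatch syntax: at most 10 of the
# 100 tasks run at the same time.
cat > karl_safe.sbatch <<'EOF'
#!/bin/bash
#SBATCH --array=0-99%10
EOF
```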
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve fails too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
Computing a long light curve with short binning (e.g. daily binning for a year-long light curve, or weekly binning for a light curve covering a few years) can terminate the analysis process when the machine cannot handle the number of open files anymore. I'm not entirely sure what goes wrong internally; my suggested workaround is to split the computation of the light curve into manageable time intervals (e.g. chunks of ~100 bins). After all chunks have been computed, you can merge them using the command ''ftmerge'':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
&lt;br /&gt;
# Step 1: Create list of all light curve chunk files&lt;br /&gt;
ls -1 /path_to_my_analysis_results/source_name/LC_analysis/*_lightcurve.fits &amp;gt; /path_to_my_analysis_results/source_name.list&lt;br /&gt;
&lt;br /&gt;
# /path_to_my_analysis_results/ should contain the folders of each chunk, e.g. named MJD_start_end&lt;br /&gt;
&lt;br /&gt;
# Step 2: Merge all light curve files into one fits file&lt;br /&gt;
ftmerge @/path_to_my_analysis_results/source_name.list /path_to_my_analysis_results/lightcurve_source_name.fits clobber = yes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
No need to worry about losing already computed bins if your analysis crashes: they are not deleted when the script crashes mid-way, so the finished LC bins are still available afterwards and the analysis does not have to be redone for all light curve bins.&lt;br /&gt;
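The chunk boundaries themselves can be generated mechanically; a sketch in plain shell (the function ''chunks'' is just an illustration, not part of the pipeline):

```shell
# Print "tmin tmax" pairs splitting an MJD range into chunks of at most
# CHUNKLEN days (700 d = 100 weekly bins); run make_lightcurve.sl per pair.
chunks() {  # usage: chunks TMIN TMAX CHUNKLEN
  start=$1
  while [ "$start" -lt "$2" ]; do
    end=$(( start + $3 ))
    if [ "$end" -gt "$2" ]; then end=$2; fi
    echo "$start $end"
    start=$end
  done
}

chunks 58000 59400 700
# -> 58000 58700
#    58700 59400
```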
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2217</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2217"/>
		<updated>2021-06-23T10:18:55Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the script, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is an extensive documentation available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, please find the information on the Fermi contact person at the Remeis observatory at the bottom of the page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package called ''fermipy'' that allows one to do the data analysis from within a python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options: using the already existing conda environment, or creating the corresponding conda environment yourself. The first option is currently recommended, also because the pipeline scripts are written for python 2.7; for the second option, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the versions in this conda environment. To invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation that you regularly use, make sure that you create an alias instead of sourcing this conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition to your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub; the full code, documentation, and installation instructions can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you install the right versions of each (between versions there was a switch from python2.7 to python3). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contains information about the events/photons measured by the LAT, as well as information about the status of the spacecraft itself at each point in time. This information is the main input for any analysis. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, hence, it is not necessary to download a specific dataset. The provided analysis scripts automatically look through all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
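A quick sanity check that the two variables expand as intended (paths exactly as in the .cshrc version above):

```shell
#!/bin/bash
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources

# Should print the full extended-source template path:
echo "$LATEXTDIR"   # -> /satdata/X-ray/Fermi/catalogs/4FGL_extended_sources
```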
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis on a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and both a start time and an end time, for which an average gamma-ray spectrum should be computed. There are several options you can choose from regarding the specifics of the analysis, which you can call via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you are analysing a known gamma-ray source that is listed in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to provide the coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The script to compute a light curve is called ''make_lightcurve.sl'' and is invoked analogously via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and both a start time and an end time between which the light curve should be computed. Again, several options are provided for specific analysis needs, which you can list via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which the light curve will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take several hours up to days (if you want to compute many bins for your light curve). Ideally you want to let the cluster perform your jobs for you and not run an analysis on your desktop machine, unless you want to test something. In order to make life easy for a job submission on slurm, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' should look like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the name and the time ranges for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''' gives a full overview of all available options:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, set only number in unit of Gigabyte; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailaddress	If you want to get informed about the slurm status per email, put your email here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. In particular, the memory and walltime options need to be increased if you want to compute a light curve with more than 50 bins.&lt;br /&gt;
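As a quick illustration of the slurm options, a light curve submission with increased resources could look like the sketch below (source name, MJD range, and resource values are purely hypothetical placeholders; the snippet only prints the command it would run):

```shell
# Hypothetical submission of a daily-binned light curve with a longer
# walltime and more memory; all values below are placeholders.
SRC="4FGLJ1104.4+3812"
TMIN=58000
TMAX=59000
CMD="\$FERMITOOLS/submit_fermijob.sl --timebin=1 --walltime=48 --mem=6 LC $SRC $TMIN $TMAX"
# print the command instead of running it (this sketch cannot assume slurm access)
echo "$CMD"
```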
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this is not possible to use via slurm at the moment. If you're running an analysis on your machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information on this, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
'''How can I submit an array job?''' - In order to do that, you need to work from within a bash shell, not a (t)csh shell. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
During the Fermi data analysis, several errors can pop up and are not necessarily linked to the Fermi analysis itself. Here, I'll show those I encountered and what I have done to (hot)fix them.&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Individual bins crashed and then the light curve cannot be assembled''' &amp;lt;br/&amp;gt;&lt;br /&gt;
This is a problem that I fixed manually by editing the source code in the lightcurve.py file. While I have also [https://github.com/fermiPy/fermipy/issues/405 reported this issue to the developers], it has not been fixed yet (as of June 23, 2021). If you are setting up your own conda environment and come across an error related to ''KeyError: fit_success'', here is my hot fix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# this snippet is nearly at the bottom of the file&lt;br /&gt;
&lt;br /&gt;
        itimes = enumerate(zip(times[:-1], times[1:]))&lt;br /&gt;
        &amp;lt;span style=&amp;quot;color:red&amp;quot;&amp;gt;flux_const_for_calcTS_var = None&amp;lt;/span&amp;gt;&lt;br /&gt;
        for i, time in itimes:&lt;br /&gt;
&lt;br /&gt;
            &amp;lt;span style=&amp;quot;color:red&amp;quot;&amp;gt;if not 'fit_success' in mapo[i]:&lt;br /&gt;
                self.logger.error(&lt;br /&gt;
                'fit_success not found in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                continue&lt;br /&gt;
&lt;br /&gt;
            el&amp;lt;/span&amp;gt;if not mapo[i]['fit_success']:&lt;br /&gt;
                self.logger.error(&lt;br /&gt;
                    'Fit failed in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                continue&lt;br /&gt;
&lt;br /&gt;
            for k in o.keys():&lt;br /&gt;
&lt;br /&gt;
                if k == 'config':&lt;br /&gt;
                    continue&lt;br /&gt;
                if not k in mapo[i]:&lt;br /&gt;
                    continue&lt;br /&gt;
                # if (isinstance(o[k], np.ndarray) and&lt;br /&gt;
                #    o[k][i].shape != mapo[i][k].shape):&lt;br /&gt;
                #    gta.logger.warning('Incompatible shape for column %s', k)&lt;br /&gt;
                #    continue&lt;br /&gt;
&lt;br /&gt;
                try:&lt;br /&gt;
                    o[k][i] = mapo[i][k]&lt;br /&gt;
                except:&lt;br /&gt;
                    pass&lt;br /&gt;
&lt;br /&gt;
        systematic = kwargs.get('systematic', 0.02)&lt;br /&gt;
&lt;br /&gt;
        o['ts_var'] = calcTS_var(loglike=o['loglike_fixed'],&lt;br /&gt;
                                 loglike_const=o['loglike_const'],&lt;br /&gt;
                                 flux_err=o['flux_err_fixed'],&lt;br /&gt;
                                 flux_const=mapo[0]['flux_const'],&lt;br /&gt;
                                 systematic=systematic,&lt;br /&gt;
                                 fit_success=o['fit_success_fixed'])&lt;br /&gt;
&lt;br /&gt;
        return o&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
While /karl has the benefit of storing large amounts of data, there is an issue with computing many Fermi spectra or light curves at the same time and storing them on /karl. The resulting problems are not clearly visible from the log files. An error related to this can, for example, look like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GTBinnedAnalysis.run_gtapp(): Caught N3tip12TipExceptionE at the top level: setNumRecords could not insert rows in FITS table in extension &amp;quot;EVENTS&amp;quot; in file &amp;quot;/path_to_analysis/LC_analysis/ft1_00.fits&amp;quot; (CFITSIO ERROR 108: error reading from FITS file)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In case you need to compute a lot of spectra or light curves and want to do this on /karl, either limit the number of jobs running simultaneously to 10, or store the data on /userdata, where I have so far never encountered this problem.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve fails with too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
Computing a long light curve with short binning (e.g. daily binning for a year-long light curve, or weekly binning for a light curve covering a few years) can terminate the analysis process once the computer cannot handle the number of open files anymore. I'm not entirely sure what goes wrong internally; my suggested hot fix is to split the computation of the light curve into manageable time intervals (e.g. chunks of ~100 bins). After all chunks have been computed, you can put them together using the command ''ftmerge'':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
&lt;br /&gt;
# Step 1: Create list of all light curve chunk files&lt;br /&gt;
ls -1 /path_to_my_analysis_results/source_name/LC_analysis/*_lightcurve.fits &amp;gt; /path_to_my_analysis_results/source_name.list&lt;br /&gt;
&lt;br /&gt;
# /path_to_my_analysis_results/ should contain the folders of each chunk, e.g. named MJD_start_end&lt;br /&gt;
&lt;br /&gt;
# Step 2: Merge all light curve files into one fits file&lt;br /&gt;
ftmerge @/path_to_my_analysis_results/source_name.list /path_to_my_analysis_results/lightcurve_source_name.fits clobber = yes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Don't worry about losing already computed bins if your analysis crashes: they are not deleted when the script crashes mid-way, so the finished LC bins are still available afterwards and it is not necessary to redo the analysis for all of the light curve bins.&lt;br /&gt;
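The splitting into chunks can be sketched as follows (plain POSIX shell rather than the tcsh used above; all numbers are hypothetical). Each printed chunk would correspond to one separate analysis run over that MJD range:

```shell
# Hypothetical example: a daily-binned light curve over MJD 55000-56400,
# split into chunks of 100 bins each.
TMIN=55000
TMAX=56400
TIMEBIN=1          # bin length in days (--timebin)
BINS_PER_CHUNK=100
STEP=$((TIMEBIN * BINS_PER_CHUNK))
start=$TMIN
while [ "$start" -lt "$TMAX" ]; do
  end=$((start + STEP))
  if [ "$end" -gt "$TMAX" ]; then end=$TMAX; fi
  # each chunk becomes one analysis run with tmin=$start, tmax=$end
  echo "chunk: MJD $start - $end"
  start=$end
done
```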
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2216</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2216"/>
		<updated>2021-06-23T10:17:50Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the script, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is extensive documentation available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, please find the information on the Fermi contact person at the Remeis observatory at the bottom of the page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a Python package called ''fermipy'', which allows one to do the data analysis from within a Python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. They are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options: using the existing conda environment, or creating the corresponding conda environment yourself. The first is currently the recommended choice, not least because the pipeline scripts are written for Python 2.7; for the second, you need to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with this version of the conda environment. In order to invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition in your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub and the full code, documentation and installation can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing the right version of each (between versions there was a switch from Python 2.7 to Python 3). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT and about the status of the spacecraft itself at each time. This information is the main input to any analysis. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, so it is not necessary to download a specific dataset. The provided analysis scripts automatically look through all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the Galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. They are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis, as background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis on a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and both a start time and an end time, for which an average gamma-ray spectrum should be computed. There are several options you can choose from regarding the specifics of the analysis, which you can call via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you do the analysis on a known gamma-ray source which is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to add the coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
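To get a feeling for what the --ebin option implies, the following sketch (my own illustration using the default values, not code from the pipeline; how the script rounds the total bin count is an assumption) computes logarithmic energy bin edges with 10 bins per decade:

```python
import math

# Defaults from the help output: emin=100 MeV, emax=300000 MeV, ebin=10.
emin, emax, ebin = 100.0, 300000.0, 10
ndec = math.log10(emax / emin)      # number of energy decades, about 3.48
nbins = int(round(ndec * ebin))     # total number of bins, 35
step = ndec / nbins
edges = [emin * 10 ** (i * step) for i in range(nbins + 1)]
print(nbins, round(edges[0]), round(edges[-1]))
```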
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The script for computing a light curve is similarly called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and both a start time and an end time, for which the light curve should be computed. Again, several options are provided for specific needs of the analysis, which you can call via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
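Before starting a long analysis, it can be useful to estimate how many bins a light curve will have, since runtime and memory scale with that number. A minimal sketch (my own, with hypothetical numbers):

```python
import math

# Hypothetical MJD range with the default weekly binning (--timebin=7).
tmin, tmax, timebin = 58000, 58140, 7   # MJD start/end, bin length in days
nbins = math.ceil((tmax - tmin) / timebin)
print(nbins)  # 20 bins
```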
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take several hours up to days (if you want to compute many bins for your light curve). Ideally, you let the cluster perform your jobs rather than run an analysis on your desktop machine, unless you want to test something. To make job submission on slurm easy, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' should look like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the name and the time ranges for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''' gives a full overview of all options available here:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, set only number in unit of Gigabyte; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailaddress	If you want to get informed about the slurm status per email, put your email here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. In particular, the memory and walltime options need to be increased if you want to compute a light curve with more than 50 bins.&lt;br /&gt;
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this is not possible to use via slurm at the moment. If you're running an analysis on your machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information on this, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
'''How can I submit an array job?''' - In order to do that, you need to work from within a bash shell, not a (t)csh shell. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
During the Fermi data analysis, several errors can pop up and are not necessarily linked to the Fermi analysis itself. Here, I'll show those I encountered and what I have done to (hot)fix them.&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Individual bins crashed and then the light curve cannot be assembled''' &amp;lt;br/&amp;gt;&lt;br /&gt;
This is a problem that I fixed manually by editing the source code in the lightcurve.py file. While I have also [https://github.com/fermiPy/fermipy/issues/405 reported this issue to the developers], it has not been fixed yet (as of June 23, 2021). If you are setting up your own conda environment and come across an error related to ''KeyError: fit_success'', here is my hot fix:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# this snippet is nearly at the bottom of the file&lt;br /&gt;
&lt;br /&gt;
        itimes = enumerate(zip(times[:-1], times[1:]))&lt;br /&gt;
        &amp;lt;span style=&amp;quot;color:red&amp;quot;&amp;gt;flux_const_for_calcTS_var = None&amp;lt;/span&amp;gt;&lt;br /&gt;
        for i, time in itimes:&lt;br /&gt;
&lt;br /&gt;
            &amp;lt;span style=&amp;quot;color:red&amp;quot;&amp;gt;if not 'fit_success' in mapo[i]:&lt;br /&gt;
                self.logger.error(&lt;br /&gt;
                'fit_success not found in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                continue&lt;br /&gt;
&lt;br /&gt;
            el&amp;lt;/span&amp;gt;if not mapo[i]['fit_success']:&lt;br /&gt;
                self.logger.error(&lt;br /&gt;
                    'Fit failed in bin %d in range %i %i.' % (i, time[0], time[1]))&lt;br /&gt;
                continue&lt;br /&gt;
&lt;br /&gt;
            for k in o.keys():&lt;br /&gt;
&lt;br /&gt;
                if k == 'config':&lt;br /&gt;
                    continue&lt;br /&gt;
                if not k in mapo[i]:&lt;br /&gt;
                    continue&lt;br /&gt;
                # if (isinstance(o[k], np.ndarray) and&lt;br /&gt;
                #    o[k][i].shape != mapo[i][k].shape):&lt;br /&gt;
                #    gta.logger.warning('Incompatible shape for column %s', k)&lt;br /&gt;
                #    continue&lt;br /&gt;
&lt;br /&gt;
                try:&lt;br /&gt;
                    o[k][i] = mapo[i][k]&lt;br /&gt;
                except:&lt;br /&gt;
                    pass&lt;br /&gt;
&lt;br /&gt;
        systematic = kwargs.get('systematic', 0.02)&lt;br /&gt;
&lt;br /&gt;
        o['ts_var'] = calcTS_var(loglike=o['loglike_fixed'],&lt;br /&gt;
                                 loglike_const=o['loglike_const'],&lt;br /&gt;
                                 flux_err=o['flux_err_fixed'],&lt;br /&gt;
                                 flux_const=mapo[0]['flux_const'],&lt;br /&gt;
                                 systematic=systematic,&lt;br /&gt;
                                 fit_success=o['fit_success_fixed'])&lt;br /&gt;
&lt;br /&gt;
        return o&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
While /karl has the benefit of storing large amounts of data, there is an issue with computing many Fermi spectra or light curves at the same time and storing them on /karl. The resulting problems are not clearly visible from the log files. An error related to this can, for example, look like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GTBinnedAnalysis.run_gtapp(): Caught N3tip12TipExceptionE at the top level: setNumRecords could not insert rows in FITS table in extension &amp;quot;EVENTS&amp;quot; in file &amp;quot;/path_to_analysis/LC_analysis/ft1_00.fits&amp;quot; (CFITSIO ERROR 108: error reading from FITS file)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In case you need to compute a lot of spectra or light curves and want to do this on /karl, either limit the number of jobs running simultaneously to 10, or store the data on /userdata, where I have so far never encountered this problem.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve fails with too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
Computing a long light curve with short binning (e.g. daily binning for a year-long light curve, or weekly binning for a light curve covering a few years) can terminate the analysis process when the computer cannot handle the number of open files anymore. I'm not entirely sure what goes wrong internally; my suggested hot fix is to split the computation of the light curve into manageable time intervals (e.g. chunks of ~100 bins). After all chunks have been computed, you can put them together using the command ''ftmerge'':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
&lt;br /&gt;
# Step 1: Create list of all light curve chunk files&lt;br /&gt;
ls -1 /path_to_my_analysis_results/source_name/LC_analysis/*_lightcurve.fits &amp;gt; /path_to_my_analysis_results/source_name.list&lt;br /&gt;
&lt;br /&gt;
# /path_to_my_analysis_results/ should contain the folders of each chunk, e.g. named MJD_start_end&lt;br /&gt;
&lt;br /&gt;
# Step 2: Merge all light curve files into one fits file&lt;br /&gt;
ftmerge @/path_to_my_analysis_results/source_name.list /path_to_my_analysis_results/lightcurve_source_name.fits clobber=yes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
No need to worry about losing already computed bins if your analysis crashes: they are not deleted when the script crashes mid-way, so the already computed LC bins remain available and the analysis does not have to be redone for all of the light curve bins.&lt;br /&gt;
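The chunking itself can be scripted. A possible sketch in tcsh, submitting one slurm job per 100-day chunk via ''submit_fermijob.sl'' (the source name and the MJD boundaries are placeholders):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
# sketch: submit one light curve job per 100-day chunk&lt;br /&gt;
# (source name and MJD range are placeholders)&lt;br /&gt;
set tmin = 58000&lt;br /&gt;
while ( $tmin &amp;lt; 58500 )&lt;br /&gt;
    @ tmax = $tmin + 100&lt;br /&gt;
    $FERMITOOLS/submit_fermijob.sl LC 4FGLJ0000.0+0000 $tmin $tmax&lt;br /&gt;
    @ tmin = $tmax&lt;br /&gt;
end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;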
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2215</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2215"/>
		<updated>2021-06-23T10:05:24Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. In survey mode it scans the entire sky roughly every three hours, hence there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the script, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, extensive documentation is available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, please find the information on the Fermi contact person at the Remeis observatory at the bottom of the page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a Python package called ''fermipy'', which allows the data analysis to be done from within a Python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options: using the existing conda environment, or creating a corresponding conda environment yourself. The first is currently the recommended choice, not least because the pipeline scripts are written for Python 2.7; for the second option, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with this conda environment. In order to invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created before activating the environment. To simplify this, you can copy the following into your .cshrc file:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition in your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub and the full code, documentation and installation can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing the right version of each (between versions there was a switch from Python 2.7 to Python 3). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT and about the status of the spacecraft itself at each point in time. This information is the main input for any analysis. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, hence it is not necessary to download a specific dataset. The provided analysis scripts automatically look through all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the Galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis on a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and a start time and an end time for which an average gamma-ray spectrum should be computed. There are several options regarding the specifics of the analysis; you can list them via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you do the analysis on a known gamma-ray source, which is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to add the coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
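For example (the source name, coordinates, and time range below are placeholders):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# known 4FGL source, average spectrum between MJD 58000 and 58100&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl 4FGLJ0000.0+0000 58000 58100&lt;br /&gt;
&lt;br /&gt;
# source not in the 4FGL: state this and pass the coordinates explicitly&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --known=no --ra=123.45 --dec=-12.34 MySource 58000 58100&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;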
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The computation of a light curve is similarly called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and both a start time and an end time, for which the light curve should be computed. Again, several options are provided for specific needs of the analysis, which you can call via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which the light curve will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
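For example, a daily-binned light curve could be requested like this (the source name and time range are placeholders):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# daily binning (--timebin in days) over roughly one year&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --timebin=1 4FGLJ0000.0+0000 58000 58365&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;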
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take several hours up to days (if you want to compute many bins for your light curve). Ideally you want to let the cluster perform your jobs for you and not run an analysis on your desktop machine, unless you want to test something. In order to make life easy for a job submission on slurm, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' should look like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the name and the time ranges for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''', gives a full overview of all options available here:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known = yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, set only number in unit of Gigabyte; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=address	If you want to be informed about the slurm status per email, put your email address here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. Especially the memory and walltime options need to be increased in case you want to compute a light curve with more than 50 bins.&lt;br /&gt;
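For example, a multi-year, daily-binned light curve might be submitted with increased walltime and memory like this (the source name, time range, and resource values are placeholders to adapt to your analysis):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# daily binning, 48 h walltime, 6 GB memory per CPU&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl --timebin=1 --walltime=48 --mem=6 LC 4FGLJ0000.0+0000 57000 58000&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;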
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this is not possible to use via slurm at the moment. If you're running an analysis on your machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information on this, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
'''How can I submit an array job?''' - In order to do that, you need to work from within a bash shell, not a (t)csh shell. &lt;br /&gt;
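A minimal sketch of such an array job, assuming a hypothetical per-task script ''run_chunk.sh'' (the ''%10'' suffix additionally limits how many tasks run at once):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
#SBATCH --job-name=fermi_array&lt;br /&gt;
#SBATCH --array=0-19%10&lt;br /&gt;
&lt;br /&gt;
# each task processes one chunk, indexed by the array task ID&lt;br /&gt;
./run_chunk.sh $SLURM_ARRAY_TASK_ID&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Submit the file with '''sbatch''' from a bash shell.&lt;br /&gt;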
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
During the Fermi data analysis, several errors can pop up that are not necessarily linked to the Fermi analysis itself. Here, I'll show those I encountered and what I have done to (hot)fix them.&amp;lt;br/&amp;gt;&lt;br /&gt;
'''error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Individual bins crashed and then the light curve cannot be assembled''' &amp;lt;br/&amp;gt;&lt;br /&gt;
This is a problem that I fixed manually by editing the source code in the lightcurve.py file. While I have also [https://github.com/fermiPy/fermipy/issues/405 reported this issue to the developers], it has not been fixed yet (as of June 23, 2021). In order to still manage to create the light curve, I implemented the following hot fix. In case you are setting up your own conda environment and you come across an error related to ''KeyError: fit_success'', you will need to apply a similar manual fix to ''lightcurve.py''.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
While /karl has the benefit of storing large amounts of data, there is an issue with computing many Fermi spectra or light curves at the same time and storing the results on /karl. The resulting problems are not clearly visible in the log files. A related error looks, for example, like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GTBinnedAnalysis.run_gtapp(): Caught N3tip12TipExceptionE at the top level: setNumRecords could not insert rows in FITS table in extension &amp;quot;EVENTS&amp;quot; in file &amp;quot;/path_to_analysis/LC_analysis/ft1_00.fits&amp;quot; (CFITSIO ERROR 108: error reading from FITS file)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In case you need to compute a lot of spectra or light curves and want to do this on /karl, either limit the number of jobs running simultaneously to 10, or store the data on /userdata, where I have so far never encountered this problem.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve fails: too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
Computing a long light curve with short binning (e.g. daily binning for a year-long light curve, or weekly binning for a light curve covering a few years) can terminate the analysis process when the computer cannot handle the number of open files anymore. I'm not entirely sure what goes wrong internally; my suggested hot fix is to split the computation of the light curve into manageable time intervals (e.g. chunks of ~100 bins). After all chunks have been computed, you can put them together using the command ''ftmerge'':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
&lt;br /&gt;
# Step 1: Create list of all light curve chunk files&lt;br /&gt;
ls -1 /path_to_my_analysis_results/source_name/LC_analysis/*_lightcurve.fits &amp;gt; /path_to_my_analysis_results/source_name.list&lt;br /&gt;
&lt;br /&gt;
# /path_to_my_analysis_results/ should contain the folders of each chunk, e.g. named MJD_start_end&lt;br /&gt;
&lt;br /&gt;
# Step 2: Merge all light curve files into one fits file&lt;br /&gt;
ftmerge @/path_to_my_analysis_results/source_name.list /path_to_my_analysis_results/lightcurve_source_name.fits clobber=yes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
No need to worry about losing already computed bins if your analysis crashes: they are not deleted when the script crashes mid-way, so the already computed LC bins remain available and the analysis does not have to be redone for all of the light curve bins.&lt;br /&gt;
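The chunking itself can be scripted. A possible sketch in tcsh, submitting one slurm job per 100-day chunk via ''submit_fermijob.sl'' (the source name and the MJD boundaries are placeholders):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
# sketch: submit one light curve job per 100-day chunk&lt;br /&gt;
# (source name and MJD range are placeholders)&lt;br /&gt;
set tmin = 58000&lt;br /&gt;
while ( $tmin &amp;lt; 58500 )&lt;br /&gt;
    @ tmax = $tmin + 100&lt;br /&gt;
    $FERMITOOLS/submit_fermijob.sl LC 4FGLJ0000.0+0000 $tmin $tmax&lt;br /&gt;
    @ tmin = $tmax&lt;br /&gt;
end&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;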
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2214</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2214"/>
		<updated>2021-06-23T10:03:54Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. In survey mode it scans the entire sky roughly every three hours, hence there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the script, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, extensive documentation is available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, please find the information on the Fermi contact person at the Remeis observatory at the bottom of the page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a Python package called ''fermipy'', which allows the data analysis to be done from within a Python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options: using the existing conda environment, or creating a corresponding conda environment yourself. The first is currently the recommended choice, not least because the pipeline scripts are written for Python 2.7; for the second option, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with this conda environment. In order to invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created before activating the environment. To simplify this, you can copy the following into your .cshrc file:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition in your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub and the full code, documentation and installation can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing the right version of each (between versions there was a switch from Python 2.7 to Python 3). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT and about the status of the spacecraft itself at each point in time. This information is the main input for any analysis. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, hence it is not necessary to download a specific dataset. The provided analysis scripts automatically look through all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the Galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
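After re-sourcing your shell configuration, you can quickly verify that both variables resolve, e.g.:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
echo $FERMI_DIR&lt;br /&gt;
ls $LATEXTDIR&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;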
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis of a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV, and that they have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and both a start time and an end time, for which an average gamma-ray spectrum should be computed. There are several options you can choose from regarding the specifics of the analysis, which you can call via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you analyse a known gamma-ray source that is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to provide its coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
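For example (the source name and MJD range here are placeholders, adjust them to your target and epoch), an average spectrum over roughly one month could be computed via:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl 4FGLJ1104.4+3812 58849 58880&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;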
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The computation of a light curve is similarly called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and both a start time and an end time, for which the light curve should be computed. Again, several options are provided for the specifics of the analysis, which you can list via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
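For example (placeholder source name and MJD values), a daily-binned light curve over a 100-day interval can be requested by overriding the default via:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl 4FGLJ1104.4+3812 58849 58949 --timebin=1&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;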
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take several hours up to days (if you want to compute many bins for your light curve). Ideally, you let the cluster perform these jobs for you and do not run an analysis on your desktop machine, unless you want to test something. To make job submission to slurm easy, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' looks like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the name and the time ranges for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''', gives a full overview of all options available here:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known = yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, give only the number, in units of gigabytes; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailaddress	If you want to be informed about the slurm status via email, put your email address here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. In particular, the memory and walltime options need to be increased in case you want to compute a light curve with more than 50 bins.&lt;br /&gt;
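As a sketch (placeholder source name, MJD values, and email address), a daily-binned, year-long light curve with increased resources could be submitted via:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl LC 4FGLJ1104.4+3812 58849 59214 --timebin=1 --walltime=48 --mem=6 --email=first.last@fau.de&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;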
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this is not possible to use via slurm at the moment. If you're running an analysis on your machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information on this, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
'''How can I submit an array job?''' - In order to do that, you need to work from within a bash shell, not a (t)csh shell. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
During the Fermi data analysis, several errors can pop up and are not necessarily linked to the Fermi analysis itself. Here, I'll show those I encountered and what I have done to (hot)fix them.&amp;lt;br/&amp;gt;&lt;br /&gt;
'''error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Individual bins crashed and then the light curve cannot be assembled''' &amp;lt;br/&amp;gt;&lt;br /&gt;
This is a problem that I fixed manually by editing the source code in the lightcurve.py file. While I have also [https://github.com/fermiPy/fermipy/issues/405 reported this issue to the developers], it has not been fixed yet (as of June 23, 2021). In order to still manage to create the light curve, I implemented the following hot fix. In case you are setting up your own conda environment and come across an error related to ''KeyError: fit_success''&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
While /karl has the benefit of storing large amounts of data, there is an issue with computing many Fermi spectra or light curves at the same time while storing them on /karl. The resulting problems are not clearly visible from the log files. An error related to this can look, for example, like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GTBinnedAnalysis.run_gtapp(): Caught N3tip12TipExceptionE at the top level: setNumRecords could not insert rows in FITS table in extension &amp;quot;EVENTS&amp;quot; in file &amp;quot;/path_to_analysis/LC_analysis/ft1_00.fits&amp;quot; (CFITSIO ERROR 108: error reading from FITS file)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In case you need to compute a lot of spectra or light curves and want to do this on /karl, either limit the number of jobs running simultaneously to 10, or store the data on /userdata, where I have so far never encountered this problem.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve fails due to too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
Computing a long light curve with short bins (e.g. daily binning for a year-long light curve, or weekly binning for a light curve covering a few years) can result in a termination of the analysis when the computer cannot handle the number of open files anymore. I'm not entirely sure what goes wrong internally; my suggested hot fix is to split the computation of the light curve into manageable time intervals (e.g. chunks of ~100 bins). After all chunks have been computed, you can put them together using the command ''ftmerge'':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
&lt;br /&gt;
# Step 1: Create list of all light curve chunk files&lt;br /&gt;
ls -1 /path_to_my_analysis_results/source_name/LC_analysis/*_lightcurve.fits &amp;gt; /path_to_my_analysis_results/source_name.list&lt;br /&gt;
&lt;br /&gt;
# /path_to_my_analysis_results/ should contain the folders of each chunk, e.g. named MJD_start_end&lt;br /&gt;
&lt;br /&gt;
# Step 2: Merge all light curve files into one fits file&lt;br /&gt;
ftmerge @/path_to_my_analysis_results/source_name.list /path_to_my_analysis_results/lightcurve_source_name.fits clobber=yes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
No need to worry about losing already computed bins in case your analysis crashes: they are not deleted when the script crashes mid-way, so already computed LC bins remain available afterwards and the analysis does not have to be redone for all of the light curve bins.&lt;br /&gt;
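The chunks themselves can be submitted as separate slurm jobs, e.g. (placeholder source name and chunk boundaries):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
# one job per ~100-day chunk; adjust the MJD boundaries to your time range&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl LC 4FGLJ1104.4+3812 58849 58949 --timebin=1&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl LC 4FGLJ1104.4+3812 58949 59049 --timebin=1&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl LC 4FGLJ1104.4+3812 59049 59149 --timebin=1&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;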
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2213</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2213"/>
		<updated>2021-06-23T09:14:40Z</updated>

		<summary type="html">&lt;p&gt;Gokus: /* Troubleshooting */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the script, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is an extensive documentation available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, please find the information on the Fermi contact person at the Remeis observatory at the bottom of the page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package, called ''fermipy'', such that one can do the data analysis from within a python environment. There are analysis scripts, written with ''fermipy'', available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options: using the existing conda environment, or creating the corresponding conda environment yourself. The first option is, at the current moment, the recommended choice, also because the pipeline scripts are written for python 2.7; for the latter, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with this conda environment. In order to invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition to your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub and the full code, documentation and installation can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you install the right version of each (between versions, the supported Python changed from 2.7 to 3). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT and about the status of the spacecraft itself at each point in time. This information is the main input of any analysis. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, so it is not necessary to download a specific dataset. The provided analysis scripts automatically search all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the Galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis of a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV, and that they have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and both a start time and an end time, for which an average gamma-ray spectrum should be computed. There are several options you can choose from regarding the specifics of the analysis, which you can call via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you analyse a known gamma-ray source that is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to provide its coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The computation of a light curve is similarly called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and both a start time and an end time, for which the light curve should be computed. Again, several options are provided for the specifics of the analysis, which you can list via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take several hours up to days (if you want to compute many bins for your light curve). Ideally, you let the cluster perform these jobs for you and do not run an analysis on your desktop machine, unless you want to test something. To make job submission to slurm easy, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' looks like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the name and the time ranges for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''' gives a full overview of all available options:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known = yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, set only number in unit of Gigabyte; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailadress	If you want to get informed about the slurm status per email, put email here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. In particular, the memory and walltime options need to be increased in case you want to compute a light curve with more than 50 bins.&lt;br /&gt;
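For illustration, a submission for a long light curve with increased walltime and memory might look like the following sketch (source name, time range, and resource numbers are illustrative, not recommended values):&lt;br /&gt;

```shell
# Sketch: submit a light curve job with more walltime/memory than the
# defaults (10 h, 3 GB); all concrete values here are illustrative.
analysis=LC
src=4FGLJ2158.8-3013
tmin=58000     # MJD, start
tmax=59000     # MJD, end (~1000 days, i.e. >100 weekly bins)
# Guarded so the sketch is copy-paste safe outside the Fermi environment:
if [ -n "$FERMITOOLS" ]; then
    $FERMITOOLS/submit_fermijob.sl --walltime=48 --mem=6 \
        "$analysis" "$src" "$tmin" "$tmax"
fi
```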
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this is not possible to use via slurm at the moment. If you're running an analysis on your machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information on this, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
'''How can I submit an array job?''' - In order to do that, you need to work from within a bash shell, not a (t)csh shell. &lt;br /&gt;
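As a hypothetical sketch (not part of the pipeline scripts), such an array job could split a long light curve into time chunks, with the array index selecting the chunk; all names and numbers below are illustrative:&lt;br /&gt;

```shell
#!/bin/bash
# Hypothetical sketch of a slurm array job: each array task computes one
# 100-day chunk of a light curve. Source name, MJDs, and resources are
# illustrative assumptions.
#SBATCH --array=0-9
#SBATCH --time=10:00:00
#SBATCH --mem-per-cpu=3G

chunk=100                                    # days per chunk
start=58000                                  # MJD of the first chunk
tmin=$(( start + ${SLURM_ARRAY_TASK_ID:-0} * chunk ))
tmax=$(( tmin + chunk ))
# Guarded so the sketch is copy-paste safe outside the Fermi environment:
if [ -n "$FERMITOOLS" ]; then
    $FERMITOOLS/make_lightcurve.sl --expath="chunk_${tmin}_${tmax}" \
        "4FGLJ2158.8-3013" "$tmin" "$tmax"
fi
```

The chunk results can afterwards be merged with ''ftmerge'' as shown in the Troubleshooting section.&lt;br /&gt;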
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
During the Fermi data analysis, several errors can pop up that are not necessarily linked to the Fermi analysis itself. Here, I'll show those I encountered and what I have done to (hot)fix them.&amp;lt;br/&amp;gt;&lt;br /&gt;
'''error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''problems when Fit of individual LC bins didn't even start''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
While /karl has the benefit of storing large amounts of data, there is an issue with computing many Fermi spectra or light curves at the same time and storing them on /karl. The problems that arise because of this are not clearly visible from checking the log files. An error that can be related to this looks, for example, like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GTBinnedAnalysis.run_gtapp(): Caught N3tip12TipExceptionE at the top level: setNumRecords could not insert rows in FITS table in extension &amp;quot;EVENTS&amp;quot; in file &amp;quot;/path_to_analysis/LC_analysis/ft1_00.fits&amp;quot; (CFITSIO ERROR 108: error reading from FITS file)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In case you need to compute a lot of spectra or light curves and want to do this on /karl, either limit the number of jobs running simultaneously to 10, or store the data on /userdata, where I have so far never encountered this problem.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve fails: too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
Computing a long light curve with a short binning (e.g. daily binning for a year-long light curve, or weekly binning for a light curve covering a few years) can result in a termination of the analysis process when the computer cannot handle the number of open files anymore. I'm not entirely sure what goes wrong internally; however, my suggested hotfix is to split the computation of the light curve into manageable time intervals (e.g. chunks of ~100 bins). After all chunks have been computed, you can put them together using the command ''ftmerge'':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
&lt;br /&gt;
# Step 1: Create list of all light curve chunk files&lt;br /&gt;
ls -1 /path_to_my_analysis_results/source_name/LC_analysis/*_lightcurve.fits &amp;gt; /path_to_my_analysis_results/source_name.list&lt;br /&gt;
&lt;br /&gt;
# /path_to_my_analysis_results/ should contain the folders of each chunk, e.g. named MJD_start_end&lt;br /&gt;
&lt;br /&gt;
# Step 2: Merge all light curve files into one fits file&lt;br /&gt;
ftmerge @/path_to_my_analysis_results/source_name.list /path_to_my_analysis_results/lightcurve_source_name.fits clobber = yes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
No worries about losing already computed bins in case your analysis crashes: those are not deleted when the script crashes mid-way, so the already computed LC bins are still available and it is not necessary to re-do the analysis for all of the light curve bins.&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2212</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2212"/>
		<updated>2021-06-23T09:09:26Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the scripts, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, extensive documentation is available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, please find the information on the Fermi contact person at the Remeis observatory at the bottom of the page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package, called ''fermipy'', such that one can do the data analysis from within a python environment. There are analysis scripts, written with ''fermipy'', available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options for the user to do so: using an already existing conda environment, or creating the corresponding conda environment yourself. The first option is currently the recommended choice, not least because the pipeline scripts are written for Python 2.7; for the latter, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the version of this conda environment. In order to invoke it, it is necessary to source the conda installation, in which the environment for a Fermi-LAT analysis has been created, before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition in your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub, and the full code, documentation, and installation instructions can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing the right versions of each (as there was a switch from Python 2.7 to Python 3 between versions). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contains information about the events/photons measured by the LAT, as well as about the status of the spacecraft itself at any given time. This information is the main input for an analysis; the corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, hence, it is not necessary to download a specific dataset. If you are using the provided analysis scripts, they automatically look through all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the Galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis on a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and both a start time and an end time, for which an average gamma-ray spectrum should be computed. There are several options you can choose from regarding the specifics of the analysis, which you can call via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you do the analysis on a known gamma-ray source, which is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to add the coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
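For a source that is not in the 4FGL, a call could therefore look like the following sketch (the source name, coordinates, and MJD range are purely illustrative):&lt;br /&gt;

```shell
# Sketch: spectrum of a source that is NOT in the 4FGL, so --known=no is
# set and coordinates are given explicitly; all values are illustrative.
src=MySource
ra=123.456      # RAJ2000 in degrees
dec=-12.345     # DEJ2000 in degrees
# Guarded so the sketch is copy-paste safe outside the Fermi environment:
if [ -n "$FERMITOOLS" ]; then
    $FERMITOOLS/make_spectrum.sl --known=no --ra=$ra --dec=$dec \
        "$src" 58000 58100
fi
```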
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The computation of a light curve is similarly called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and both a start time and an end time, for which the light curve should be computed. Again, several options are provided for specific needs of the analysis, which you can call via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which light curve will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
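The binning can be changed with the '--timebin' option. As a sketch, a daily-binned light curve could be requested as follows (the 4FGL name and MJD range are only illustrative):&lt;br /&gt;

```shell
# Sketch: daily-binned light curve; the source name and MJD range are
# illustrative, not recommended values.
src=4FGLJ2158.8-3013
timebin=1          # days per light curve bin
tmin=59000         # MJD, start of time range
tmax=59100         # MJD, end of time range
nbins=$(( (tmax - tmin) / timebin ))   # 100 bins for this range
# Guarded so the sketch is copy-paste safe outside the Fermi environment:
if [ -n "$FERMITOOLS" ]; then
    $FERMITOOLS/make_lightcurve.sl --timebin=$timebin "$src" "$tmin" "$tmax"
fi
```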
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take several hours up to days (if you want to compute many bins for your light curve). Ideally you want to let the cluster perform your jobs for you and not run an analysis on your desktop machine, unless you want to test something. In order to make life easy for a job submission on slurm, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' should look like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the name and the time ranges for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''' gives a full overview of all available options:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known = yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, set only number in unit of Gigabyte; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailadress	If you want to get informed about the slurm status per email, put email here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. In particular, the memory and walltime options need to be increased in case you want to compute a light curve with more than 50 bins.&lt;br /&gt;
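For illustration, a submission for a long light curve with increased walltime and memory might look like the following sketch (source name, time range, and resource numbers are illustrative, not recommended values):&lt;br /&gt;

```shell
# Sketch: submit a light curve job with more walltime/memory than the
# defaults (10 h, 3 GB); all concrete values here are illustrative.
analysis=LC
src=4FGLJ2158.8-3013
tmin=58000     # MJD, start
tmax=59000     # MJD, end (~1000 days, i.e. >100 weekly bins)
# Guarded so the sketch is copy-paste safe outside the Fermi environment:
if [ -n "$FERMITOOLS" ]; then
    $FERMITOOLS/submit_fermijob.sl --walltime=48 --mem=6 \
        "$analysis" "$src" "$tmin" "$tmax"
fi
```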
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this is not possible to use via slurm at the moment. If you're running an analysis on your machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information on this, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
'''How can I submit an array job?''' - In order to do that, you need to work from within a bash shell, not a (t)csh shell. &lt;br /&gt;
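As a hypothetical sketch (not part of the pipeline scripts), such an array job could split a long light curve into time chunks, with the array index selecting the chunk; all names and numbers below are illustrative:&lt;br /&gt;

```shell
#!/bin/bash
# Hypothetical sketch of a slurm array job: each array task computes one
# 100-day chunk of a light curve. Source name, MJDs, and resources are
# illustrative assumptions.
#SBATCH --array=0-9
#SBATCH --time=10:00:00
#SBATCH --mem-per-cpu=3G

chunk=100                                    # days per chunk
start=58000                                  # MJD of the first chunk
tmin=$(( start + ${SLURM_ARRAY_TASK_ID:-0} * chunk ))
tmax=$(( tmin + chunk ))
# Guarded so the sketch is copy-paste safe outside the Fermi environment:
if [ -n "$FERMITOOLS" ]; then
    $FERMITOOLS/make_lightcurve.sl --expath="chunk_${tmin}_${tmax}" \
        "4FGLJ2158.8-3013" "$tmin" "$tmax"
fi
```

The chunk results can afterwards be merged with ''ftmerge'' as shown in the Troubleshooting section.&lt;br /&gt;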
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
During the Fermi data analysis, several errors can pop up that are not necessarily linked to the Fermi analysis itself. Here, I'll show those I encountered and what I have done to (hot)fix them.&amp;lt;br/&amp;gt;&lt;br /&gt;
'''error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''problems when Fit of individual LC bins didn't even start''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
While /karl has the benefit of storing large amounts of data, there is an issue with computing many Fermi spectra or light curves at the same time and storing them on /karl. The problems that arise because of this are not clearly visible from checking the log files, but a common error looks, for example, like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GTBinnedAnalysis.run_gtapp(): Caught N3tip12TipExceptionE at the top level: setNumRecords could not insert rows in FITS table in extension &amp;quot;EVENTS&amp;quot; in file &amp;quot;/path_to_analysis/LC_analysis/ft1_00.fits&amp;quot; (CFITSIO ERROR 108: error reading from FITS file)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve fails: too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
Computing a long light curve with a short binning (e.g. daily binning for a year-long light curve, or weekly binning for a light curve covering a few years) can result in a termination of the analysis process when the computer cannot handle the number of open files anymore. I'm not entirely sure what goes wrong internally; however, my suggested hotfix is to split the computation of the light curve into manageable time intervals (e.g. chunks of ~100 bins). After all chunks have been computed, you can put them together using the command ''ftmerge'':&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
&lt;br /&gt;
# Step 1: Create list of all light curve chunk files&lt;br /&gt;
ls -1 /path_to_my_analysis_results/source_name/LC_analysis/*_lightcurve.fits &amp;gt; /path_to_my_analysis_results/source_name.list&lt;br /&gt;
&lt;br /&gt;
# /path_to_my_analysis_results/ should contain the folders of each chunk, e.g. named MJD_start_end&lt;br /&gt;
&lt;br /&gt;
# Step 2: Merge all light curve files into one fits file&lt;br /&gt;
ftmerge @/path_to_my_analysis_results/source_name.list /path_to_my_analysis_results/lightcurve_source_name.fits clobber = yes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
No worries about losing already computed bins in case your analysis crashes: those are not deleted when the script crashes mid-way, so the already computed LC bins are still available and it is not necessary to re-do the analysis for all of the light curve bins.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2211</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2211"/>
		<updated>2021-06-23T08:40:17Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the scripts, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, extensive documentation is available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, you will find information on the Fermi contact person at the Remeis observatory at the bottom of this page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, the Python package ''fermipy'' allows the data analysis to be done from within a Python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. They are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, both the Fermitools and ''fermipy'' must be installed, ideally in a conda environment. There are two options: activating an already existing conda environment, or creating the corresponding conda environment yourself. The first option is currently recommended, not least because the pipeline scripts are written for Python 2.7; for the second option you need to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the version of this conda environment. To invoke it, you first have to source the conda installation in which the environment for a Fermi-LAT analysis has been created, and then activate the environment. To simplify this, you can copy the following into your .cshrc file:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation that you regularly use, make sure that you create an alias instead of sourcing this conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition in your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
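To check that the environment is active and the tools can be found (a quick sanity check; ''fermipy'' exposes its version via fermipy.__version__), you can run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
fermipy&lt;br /&gt;
python -c &amp;quot;import fermipy; print(fermipy.__version__)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
For the shared environment described above, this should print 0.20.0.&lt;br /&gt;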
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub, and the full code, documentation, and installation instructions can be found at:&amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing the right version of each (the supported Python version changed from 2.7 to 3 between releases). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT as well as about the status of the spacecraft itself at each point in time. This information is the main input for any analysis. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, so it is not necessary to download a specific dataset. The provided analysis scripts automatically search through all the data that are available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly found in the Galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, which are needed to model the Galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. They are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as the background can have quite an impact, especially at lower energies (&amp;lt; 1 GeV).&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis of a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs three arguments: the name of the source, and the start and end times of the interval for which an average gamma-ray spectrum should be computed. Several options are available for the specifics of the analysis; you can list them via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you are analysing a known gamma-ray source that is listed in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to supply its coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
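As an illustration (the source name, coordinates, and MJD ranges below are made up and only show the calling convention; options precede the mandatory arguments, as in the usage line above):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl 4FGLJ1104.4+3812 58000 58100&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --known=no --ra=165.1 --dec=38.2 MySource 58000 58100&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;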
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The script for computing a light curve is analogously called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and the start and end times for which the light curve should be computed. Again, several options are provided for specific analysis needs; you can list them via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
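For example, a daily-binned light curve that keeps the individual bin products could be requested like this (source name and MJD range are only illustrative):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --timebin=1 --savebins=True 4FGLJ1104.4+3812 58000 58100&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;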
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take from several hours up to days (if you want to compute many bins for your light curve). Ideally, you let the cluster run your jobs for you instead of running an analysis on your desktop machine, unless you want to test something. To make job submission to slurm easy, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' looks like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the name and the time ranges for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''' gives a full overview of all options available here:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known = yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, set only number in unit of Gigabyte; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailadress	If you want to get informed about the slurm status per email, put email here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. In particular, the memory and walltime options need to be increased in case you want to compute a light curve with more than 50 bins.&lt;br /&gt;
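For instance, a long, daily-binned light curve with increased walltime and memory could be submitted like this (source name and time range are only illustrative):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl --walltime=48 --mem=6 --timebin=1 LC 4FGLJ1104.4+3812 58000 59000&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;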
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this is not possible to use via slurm at the moment. If you're running an analysis on your machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information on this, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&lt;br /&gt;
'''How can I submit an array job?''' - ...&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
Errors I have encountered and (hot)fixed: &amp;lt;br/&amp;gt;&lt;br /&gt;
'''error during deletion of LC subfolders''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''problems when Fit of individual LC bins didn't even start''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Using /karl'''&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Running the original script to check what's going wrong''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br/&amp;gt;&lt;br /&gt;
'''Light curve fails too many open files''' &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2208</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2208"/>
		<updated>2021-05-15T21:20:11Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It scans the entire sky three times a day, so gamma-ray data are available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the scripts, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data are freely accessible to everyone, extensive documentation is available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, you will find information on the Fermi contact person at the Remeis observatory at the bottom of this page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, the Python package ''fermipy'' allows the data analysis to be done from within a Python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. They are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, both the Fermitools and ''fermipy'' must be installed, ideally in a conda environment. There are two options: activating an already existing conda environment, or creating the corresponding conda environment yourself. The first option is currently recommended, not least because the pipeline scripts are written for Python 2.7; for the second option you need to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the version of this conda environment. To invoke it, you first have to source the conda installation in which the environment for a Fermi-LAT analysis has been created, and then activate the environment. To simplify this, you can copy the following into your .cshrc file:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation that you regularly use, make sure that you create an alias instead of sourcing this conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition in your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
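To check that the environment is active and the tools can be found (a quick sanity check; ''fermipy'' exposes its version via fermipy.__version__), you can run:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
fermipy&lt;br /&gt;
python -c &amp;quot;import fermipy; print(fermipy.__version__)&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
For the shared environment described above, this should print 0.20.0.&lt;br /&gt;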
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub, and the full code, documentation, and installation instructions can be found at:&amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing the right version of each (the supported Python version changed from 2.7 to 3 between releases). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT as well as about the status of the spacecraft itself at each point in time. This information is the main input for any analysis. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, so it is not necessary to download a specific dataset. The provided analysis scripts automatically search through all the data that are available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly found in the Galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, which are needed to model the Galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. They are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as the background can have quite an impact, especially at lower energies (&amp;lt; 1 GeV).&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis of a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs three arguments: the name of the source, and the start and end times of the interval for which an average gamma-ray spectrum should be computed. Several options are available for the specifics of the analysis; you can list them via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you are analysing a known gamma-ray source that is listed in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to supply its coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
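As an illustration (the source name, coordinates, and MJD ranges below are made up and only show the calling convention; options precede the mandatory arguments, as in the usage line above):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl 4FGLJ1104.4+3812 58000 58100&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --known=no --ra=165.1 --dec=38.2 MySource 58000 58100&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;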
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The script for computing a light curve is analogously called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and the start and end times for which the light curve should be computed. Again, several options are provided for specific analysis needs; you can list them via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
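As a hypothetical example, a daily-binned light curve of Mrk 421 (4FGL J1104.4+3812, written without blanks) over two months could be requested like this:&lt;br /&gt;

```shell
# Hypothetical example: daily binning instead of the default weekly binning.
# Options precede the positional arguments (source, tmin, tmax in MJD).
$FERMITOOLS/make_lightcurve.sl --timebin=1 4FGLJ1104.4+3812 59000 59060
```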
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take anywhere from several hours up to days (if you want to compute many bins for your light curve). Ideally, you let the cluster perform such jobs for you instead of running an analysis on your desktop machine, unless you want to test something. To make job submission on slurm easy, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' looks like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the source name and the time range for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''' gives a full overview of all options available here:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known = yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, set only number in unit of Gigabyte; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailaddress	If you want to be informed about the slurm status per email, put your email address here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While most options are the same as for the ''make_spectrum.sl'' and ''make_lightcurve.sl'' scripts, there are additional options regarding your setup on slurm. In particular, the memory and walltime options need to be increased in case you want to compute a light curve with more than 50 bins.&lt;br /&gt;
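For instance, a hypothetical submission of a multi-year weekly light curve with increased resources (all values below are illustrative) could look like:&lt;br /&gt;

```shell
# Hypothetical example: submit a long light curve job to slurm with a 48 h
# walltime and 6 GB of memory per CPU (defaults: 10 h, 3 GB).
$FERMITOOLS/submit_fermijob.sl --walltime=48 --mem=6 LC 4FGLJ1104.4+3812 58000 59000
```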
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this cannot be used via slurm at the moment. If you are running an analysis on your own machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&lt;br /&gt;
'''How can I submit an array job?''' - ...&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
Errors I have encountered and (hot)fixed: &amp;lt;br/&amp;gt;&lt;br /&gt;
- error during deletion of LC subfolders &amp;lt;br/&amp;gt;&lt;br /&gt;
- problems when Fit of individual LC bins didn't even start &amp;lt;br/&amp;gt;&lt;br /&gt;
- using /karl &amp;lt;br/&amp;gt;&lt;br /&gt;
- how to run the original script to check what's going wrong? &amp;lt;br/&amp;gt;&lt;br /&gt;
- light curve fails with &amp;quot;too many open files&amp;quot; &amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2207</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2207"/>
		<updated>2021-05-15T21:16:33Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It scans the entire sky every three hours, hence gamma-ray data are available for every position on the sky on every day. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the script, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is extensive documentation available online.&amp;lt;br/&amp;gt;&lt;br /&gt;
If you have comments or questions, please find the information on the Fermi contact person at the Remeis observatory at the bottom of the page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a Python package called ''fermipy'', which allows one to do the data analysis from within a Python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. They are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options for the user to do so: using an already existing conda environment, or creating the corresponding conda environment yourself. The first option is currently the recommended choice, not least because the pipeline scripts are written for Python 2.7; for the latter, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the versions in this conda environment. To invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation that you regularly use, make sure that you create an alias instead of sourcing this conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition in your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
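To verify that the environment works, you can activate it via the alias defined above and try importing ''fermipy'' (the version check is a suggestion, assuming ''fermipy'' exposes its version in the usual way):&lt;br /&gt;

```shell
# Activate the shared environment (alias defined in your shell rc file above),
# then check that fermipy is importable.
fermipy
python -c "import fermipy; print(fermipy.__version__)"   # should report 0.20.0
```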
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub; the full code, documentation, and installation instructions can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you install the right version of each (the switch from Python 2.7 to Python 3 happened between versions). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
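A minimal sketch of such an installation, matching the versions used by the pipeline scripts; the channel names and version pins below are assumptions, so verify them against the official installation instructions linked above before running anything:&lt;br /&gt;

```shell
# Sketch only -- channel names and version pins are assumptions; check the
# official installation docs. Creates a Python 2.7 environment with the
# Fermitools, then adds fermipy on top.
conda create -n fermi -c conda-forge -c fermi python=2.7 fermitools=1.2.23
conda activate fermi
pip install fermipy==0.20.0
```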
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT, as well as information about the status of the spacecraft itself at each point in time. This information is the main input for any analysis. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, hence it is not necessary to download a specific dataset. The provided analysis scripts automatically search through all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly found in the Galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, which are needed to model the Galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. These are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, since the background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis of a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and the start and end time of the interval for which an average gamma-ray spectrum should be computed. There are several options regarding the specifics of the analysis, which you can list via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script assumes that you are analysing a known gamma-ray source, i.e. one that is listed in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to supply its coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The script to compute a light curve is analogously called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and the start and end time of the interval for which the light curve should be computed. Again, several options are available for specific analysis needs; you can list them via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
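The time arguments are given in MJD, while the Fermitools internally use Mission Elapsed Time (MET), i.e. seconds since 2001-01-01 00:00:00 UTC (MJD 51910). If you ever need to cross-check times in intermediate products, a rough conversion (ignoring leap seconds) is:&lt;br /&gt;

```shell
# Approximate MJD -> Fermi MET conversion. Ignores leap seconds, so it can be
# off by tens of seconds -- fine for sanity checks, not for precise timing.
mjd_to_met() {
    awk -v mjd="$1" 'BEGIN { printf "%.0f\n", (mjd - 51910) * 86400 }'
}
mjd_to_met 59000   # -> 612576000
```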
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take anywhere from several hours up to days (if you want to compute many bins for your light curve). Ideally, you let the cluster perform such jobs for you instead of running an analysis on your desktop machine, unless you want to test something. To make job submission on slurm easy, you can use the script ''submit_fermijob.sl'', which is also available at the $FERMITOOLS location. The usage of ''submit_fermijob.sl'' looks like&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl ANALYSIS SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
In addition to the source name and the time range for the analysis, it is necessary to specify whether a spectral (SED) or light curve (LC) computation is requested.&lt;br /&gt;
Calling the help on this script via '''$FERMITOOLS/submit_fermijob.sl --help''' gives a full overview of all options available here:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/submit_fermijob.sl  --help&lt;br /&gt;
&lt;br /&gt;
** Script to submit a Fermi Analysis to slurm **&lt;br /&gt;
  This script creates a file to submit a Fermi LC or SED analysis to the slurm queue&lt;br /&gt;
  [Make sure you submit the job when the Fermi Environment is already loaded!]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Usage: submit_fermijob.sl [options] [analysis] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    analysis:	specify whether it's a light curve [LC] or a SED analysis [SED]&lt;br /&gt;
    src:    Name of source as string, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for the analysis&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known = yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, the coordinates&lt;br /&gt;
			from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	Number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin, only relevant for LC analysis! Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --submit=yes/no	State whether you want to submit to slurm right away; Default: YES&lt;br /&gt;
&lt;br /&gt;
   --walltime=hours	Walltime in hours, note that the default walltime might not be enough time for a longer lightcurve.&lt;br /&gt;
			If you're running into a Slurm TIMEOUT error, set a longer walltime and resubmit. Previous steps of the analysis are saved&lt;br /&gt;
			and don't need to be repeated unless something in the configuration setup has to be changed; Default: 10 hours&lt;br /&gt;
&lt;br /&gt;
   --mem=memory		Memory per CPU, set only number in unit of Gigabyte; Default: 3GB&lt;br /&gt;
&lt;br /&gt;
   --email=mailaddress	If you want to be informed about the slurm status per email, put your email address here.&lt;br /&gt;
&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
'''Is multi-threading available for the pipeline scripts?''' - No, this cannot be used via slurm at the moment. If you are running an analysis on your own machine with your own scripts, it is possible to activate multi-threading for the computation of a light curve. For more information, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&amp;lt;br/&amp;gt;&lt;br /&gt;
'''How can I submit an array job?''' - ...&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
Errors I have encountered and (hot)fixed:&lt;br /&gt;
- error during deletion of LC subfolders&lt;br /&gt;
- problems when Fit of individual LC bins didn't even start&lt;br /&gt;
- using /karl&lt;br /&gt;
- how to run the original script to check what's going wrong?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2206</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2206"/>
		<updated>2021-05-15T21:07:19Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It scans the entire sky every three hours, hence gamma-ray data are available for every position on the sky on every day. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the script, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is extensive documentation available online.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a Python package called ''fermipy'', which allows one to do the data analysis from within a Python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. They are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options for the user to do so: using an already existing conda environment, or creating the corresponding conda environment yourself. The first option is currently the recommended choice, not least because the pipeline scripts are written for Python 2.7; for the latter, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the versions in this conda environment. To invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation that you regularly use, make sure that you create an alias instead of sourcing this conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition in your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub; the full code, documentation, and installation instructions can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you install the right version of each (the switch from Python 2.7 to Python 3 happened between versions). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT, as well as information about the status of the spacecraft itself at each point in time. This information is the main input for any analysis. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the observatory's servers, so it is not necessary to download a specific dataset. If you are using the provided analysis scripts, they automatically pick up all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
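If you ever set up a ''fermipy'' configuration by hand instead of using the pipeline scripts, these two files are what goes into the data section of the config (a sketch following the ''fermipy'' documentation):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
data:&lt;br /&gt;
  evfile : /satdata/X-ray/Fermi/photon_ft1_files.list&lt;br /&gt;
  scfile : /satdata/X-ray/Fermi/spacecraft_ft2_files.fits&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;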
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly found in the galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
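In a hand-written ''fermipy'' configuration, these diffuse templates are referenced in the model section; for example (the file names below are the standard P8R3 model names and may differ from what is currently in that directory, so check before using them):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
model:&lt;br /&gt;
  galdiff : /satdata/X-ray/Fermi/LAT_background_models/gll_iem_v07.fits&lt;br /&gt;
  isodiff : /satdata/X-ray/Fermi/LAT_background_models/iso_P8R3_SOURCE_V3_v1.txt&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;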
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis on a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and both a start time and an end time, for which an average gamma-ray spectrum should be computed. There are several options you can choose from regarding the specifics of the analysis, which you can call via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you are analysing a known gamma-ray source that is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to add the coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
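Since TMIN and TMAX have to be given in MJD, it can be handy to convert calendar dates first. A small stand-alone helper (hypothetical, not part of the pipeline scripts) could look like this:&lt;br /&gt;

```python
from datetime import date

def mjd(d):
    """Modified Julian Date of a calendar date (MJD 0 = 1858-11-17)."""
    return d.toordinal() - date(1858, 11, 17).toordinal()

# One calendar year as TMIN/TMAX arguments for make_spectrum.sl
print(mjd(date(2020, 1, 1)), mjd(date(2020, 12, 31)))  # 58849 59214
```

A one-year average spectrum would then be requested as '''$FERMITOOLS/make_spectrum.sl SOURCENAME 58849 59214'''.&lt;br /&gt;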
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The computation of a light curve is similarly called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and both a start time and an end time, for which the light curve should be computed. Again, several options are provided for the specifics of the analysis, which you can list via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which light curve will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
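Every light curve bin is analysed independently, so the runtime grows roughly linearly with the number of bins. A quick estimate of that number (a hypothetical helper, not part of the scripts):&lt;br /&gt;

```python
import math

def n_lightcurve_bins(tmin_mjd, tmax_mjd, timebin_days=7):
    """Number of bins needed to cover the range; the last bin may be partial."""
    return math.ceil((tmax_mjd - tmin_mjd) / timebin_days)

# One year of data with the default weekly binning
print(n_lightcurve_bins(58849, 59214))  # 53
```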
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
The computation of LAT products can take from several hours up to days, depending on the length of the time range and the chosen binning.&lt;br /&gt;
&lt;br /&gt;
== Q&amp;amp;A ==&lt;br /&gt;
Is multi-threading available for the pipeline scripts? - No, this is currently not possible via slurm. If you're running an analysis on your own machine with your own scripts, you can activate multi-threading for the computation of a light curve. For more information, please check out https://fermipy.readthedocs.io/en/latest/advanced/lightcurve.html#optimizing-computation-speed.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
Errors&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2205</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2205"/>
		<updated>2021-05-15T21:02:48Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the script, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is an extensive documentation available online.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package called ''fermipy'', which allows one to do the data analysis from within a python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, both the Fermitools and ''fermipy'' need to be installed, ideally in a conda environment. There are two options to do so: using an already existing conda environment, or creating the corresponding conda environment yourself. The former is currently the recommended choice, also because the pipeline scripts are written for python 2.7; for the latter, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the versions in this conda environment. In order to invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation that you use regularly, make sure you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition in your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub; the full code, documentation, and installation instructions can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you install matching versions of both (the switch from Python 2.7 to Python 3 happened between releases, so not every combination works together). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain the events/photons measured by the LAT as well as the status of the spacecraft itself at each point in time. This information is the main input for any analysis. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the observatory's servers, so it is not necessary to download a specific dataset. If you are using the provided analysis scripts, they automatically pick up all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly found in the galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis on a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three arguments: the name of the source, and both a start time and an end time, for which an average gamma-ray spectrum should be computed. There are several options you can choose from regarding the specifics of the analysis, which you can call via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you are analysing a known gamma-ray source that is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to add the coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
The computation of a light curve is similarly called ''make_lightcurve.sl'' and is available via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The script needs the same arguments as the spectral analysis: the name of the source, and both a start time and an end time, for which the light curve should be computed. Again, several options are provided for the specifics of the analysis, which you can list via '''$FERMITOOLS/make_lightcurve.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_lightcurve.sl --help&lt;br /&gt;
&lt;br /&gt;
** General script to compute Fermi lightcurves **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_lightcurve.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees	RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given, &lt;br /&gt;
			the coordinates from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number    number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
   --timebin=number	Length of one light curve bin. Default: 7 [days] (weekly binning)&lt;br /&gt;
   --savebins=True/False	Save all the light curve bins? Careful, can take up a lot of space! Default: False&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Please note that the default time binning for the light curve is one week (7 days).&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2204</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2204"/>
		<updated>2021-05-15T20:49:55Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the script, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is an extensive documentation available online.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package called ''fermipy'', which allows one to do the data analysis from within a python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, both the Fermitools and ''fermipy'' need to be installed, ideally in a conda environment. There are two options to do so: using an already existing conda environment, or creating the corresponding conda environment yourself. The former is currently the recommended choice, also because the pipeline scripts are written for python 2.7; for the latter, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the versions in this conda environment. In order to invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, before activating the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation that you use regularly, make sure you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition in your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub; the full code, documentation, and installation instructions can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you install matching versions of both (the switch from Python 2.7 to Python 3 happened between releases, so not every combination works together). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contains information about the events/photons measured by the LAT, and information about status of the spacecraft itself at each time. Those information are needed for an analysis, as they are the main input. These files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the observatory's servers, so it is not necessary to download a specific dataset. If you are using the provided analysis scripts, they automatically pick up all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the galactic plane and the magellanic clouds (e.g. features of the LMC, the Centaurus A lobes, supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis on a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three mandatory arguments: the name of the source, a start time, and an end time, for which an average gamma-ray spectrum will be computed. There are several options you can choose from regarding the specifics of the analysis, which you can list via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, it is assumed that you do the analysis on a known gamma-ray source, which is named in the 4FGL. If that is the case, it is easiest to give the 4FGL name (without blanks!) as the first argument to the script. If your source is not in the 4FGL, you need to add the coordinates via the --ra and --dec options.&amp;lt;br/&amp;gt;&lt;br /&gt;
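For illustration, a call for a source that is not listed in the 4FGL could look like this (the source name, coordinates, and MJD range below are made-up placeholders):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --known=no --ra=194.04 --dec=-5.79 MYSOURCE 58849 58880&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;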
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus, who is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2203</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2203"/>
		<updated>2021-05-15T20:46:51Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day; hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the scripts, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is extensive documentation available online.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a Python package called ''fermipy'', which allows one to do the data analysis from within a Python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. They are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options: using the existing conda environment, or creating the corresponding conda environment yourself. The first option is currently the recommended choice, not least because the pipeline scripts are written for Python 2.7; for the latter, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the versions in this conda environment. To invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, before activating the environment. To simplify this, you can copy the following into your .cshrc file:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition to your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub; the full code, documentation, and installation instructions can be found at:&amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing the right version of each (the switch from Python 2.7 to Python 3 happened between versions). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
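As a rough sketch of this option (the exact channel names and version pins are assumptions here and may have changed, so please follow the official instructions linked above), creating a matching environment could look like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
conda create --name fermipy -c conda-forge -c fermi python=2.7 fermitools=1.2.23&lt;br /&gt;
conda activate fermipy&lt;br /&gt;
pip install fermipy==0.20.0&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;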
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT, as well as information about the status of the spacecraft itself at any given time. This information is the main input needed for an analysis. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory; hence, it is not necessary to download a specific dataset. The provided analysis scripts automatically search through all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the Galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, and supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis on a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
If you are interested in the gamma-ray spectrum of a source, the corresponding script is called ''make_spectrum.sl'' and can be called via&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl SOURCENAME TMIN TMAX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
As you can see, the script needs three mandatory arguments: the name of the source, a start time, and an end time, for which an average gamma-ray spectrum will be computed. There are several options you can choose from regarding the specifics of the analysis, which you can list via '''$FERMITOOLS/make_spectrum.sl --help'''.&lt;br /&gt;
The full help output looks like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$FERMITOOLS/make_spectrum.sl --help&lt;br /&gt;
** General script to compute Fermi spectra **&lt;br /&gt;
  [Make sure you have loaded the Fermitools and Fermipy (see README file/wiki entry)]&lt;br /&gt;
&lt;br /&gt;
Usage: make_spectrum.sl [options] [src] [tmin] [tmax]&lt;br /&gt;
Mandatory inputs:&lt;br /&gt;
    src:    Name of source without any blanks, either a 4FGL name or a different name if corresponding catalog is defined in options&lt;br /&gt;
    tmin:   in MJD, start of time range for which SED will be computed&lt;br /&gt;
    tmax:   in MJD, end of time range&lt;br /&gt;
&lt;br /&gt;
Options for analysis:&lt;br /&gt;
   --known=yes/no	Is source a known Fermi source (in the 4FGL)? Important to set correctly, otherwise&lt;br /&gt;
			the analysis will fail! Default: YES&lt;br /&gt;
&lt;br /&gt;
   --ra=degrees		RAJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --dec=degrees	DEJ2000 position of source, only needed if source is NOT in 4FGL and no other&lt;br /&gt;
			alternative catalog is provided with the '--cat option'!&lt;br /&gt;
&lt;br /&gt;
   --cat=catalog.fits	Catalog to query for source coordinates, must have columns named Source_Name,&lt;br /&gt;
			RAJ2000, and DEJ2000 for name and coordinates. If '--ra' and '--dec' are also given,&lt;br /&gt;
			the coordinates	from the catalog will be ignored. Default: 4FGL&lt;br /&gt;
&lt;br /&gt;
   --roi=degrees	Size of ROI in degrees. Default: 10 [deg]&lt;br /&gt;
&lt;br /&gt;
   --emin=energy	Minimum energy in MeV. Default: 100 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --emax=energy	Maximum energy in MeV. Default: 300000 [MeV]&lt;br /&gt;
&lt;br /&gt;
   --ebin=number	number of energy bins per energy decade. Default: 10&lt;br /&gt;
&lt;br /&gt;
Options for setup:&lt;br /&gt;
   --expath=path	Path where analysis should be saved. Default: location where script is started&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus, who is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2202</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2202"/>
		<updated>2021-05-15T20:41:38Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day; hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the scripts, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is extensive documentation available online.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a Python package called ''fermipy'', which allows one to do the data analysis from within a Python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. They are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options: using the existing conda environment, or creating the corresponding conda environment yourself. The first option is currently the recommended choice, not least because the pipeline scripts are written for Python 2.7; for the latter, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the versions in this conda environment. To invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, before activating the environment. To simplify this, you can copy the following into your .cshrc file:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition to your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub; the full code, documentation, and installation instructions can be found at:&amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing the right version of each (the switch from Python 2.7 to Python 3 happened between versions). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT, as well as information about the status of the spacecraft itself at any given time. This information is the main input needed for an analysis. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory; hence, it is not necessary to download a specific dataset. The provided analysis scripts automatically search through all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the Galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, and supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis on a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV. The scripts have only been tested for blazar analyses.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus, who is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2201</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2201"/>
		<updated>2021-05-15T20:40:03Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day; hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the scripts, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is extensive documentation available online.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a Python package called ''fermipy'', which allows one to do the data analysis from within a Python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. They are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options: using the existing conda environment, or creating the corresponding conda environment yourself. The first option is currently the recommended choice, not least because the pipeline scripts are written for Python 2.7; for the latter, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the versions in this conda environment. To invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, before activating the environment. To simplify this, you can copy the following into your .cshrc file:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition to your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub; the full code, documentation, and installation instructions can be found at:&amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing the right version of each (the switch from Python 2.7 to Python 3 happened between versions). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT, as well as information about the status of the spacecraft itself at any given time. This information is the main input needed for an analysis. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory; hence, it is not necessary to download a specific dataset. The provided analysis scripts automatically search through all the data that is available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) around your desired source or source coordinates will be created, which contains all the sources that are included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the Galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, and supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. Those are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as background, especially at lower energies (&amp;lt; 1 GeV), can have quite an impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Fermi-LAT analysis with the pipeline scripts ==&lt;br /&gt;
The scripts available at $FERMITOOLS are written to perform a standard analysis on a point source, e.g. a blazar. Please note that these analysis scripts might not provide correct results if you are analysing energies below 100 MeV.&lt;br /&gt;
&lt;br /&gt;
=== Creating a spectrum ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Creating a light curve ===&lt;br /&gt;
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus, who is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2200</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2200"/>
		<updated>2021-05-15T20:36:28Z</updated>

		<summary type="html">&lt;p&gt;Gokus: /* The LAT data */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the scripts, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is extensive documentation available online.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package called ''fermipy'', which allows the data analysis to be done from within a python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. They are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run these scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options: using the currently existing conda environment, or creating the corresponding conda environment yourself. The first option is at the moment the recommended choice, partly because the pipeline scripts are written for python 2.7; for the latter option, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with this version of the conda environment. To invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, and then activate the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation that you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition to your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub; the full code, documentation, and installation instructions can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you install the right version of each (there was a switch from python 2.7 to python 3 between versions). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
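As a rough sketch of this option (the channel names and version pins below are assumptions; follow the linked installation instructions for the authoritative commands), creating a matching environment could look like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# create an environment containing the Fermitools from the fermi conda channel&lt;br /&gt;
conda create --name fermipy --channel conda-forge --channel fermi python=2.7 fermitools=1.2.23&lt;br /&gt;
conda activate fermipy&lt;br /&gt;
# install the fermipy release matching the pipeline scripts&lt;br /&gt;
pip install fermipy==0.20.0&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;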
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contains information about the events/photons measured by the LAT, and about the status of the spacecraft itself at each time. This information is needed for an analysis, as it is the main input. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, hence it is not necessary to download a specific dataset. If you are using the provided analysis scripts, they automatically look through all the data available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) will be created around your desired source or source coordinates, which contains all the sources included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, and supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''. These models are provided and regularly updated by the Fermi-LAT collaboration and help to improve your analysis results, as the background, especially at lower energies (&amp;lt; 1 GeV), can have a significant impact on your results.&lt;br /&gt;
&lt;br /&gt;
== Creating a spectrum ==&lt;br /&gt;
&lt;br /&gt;
== Creating a light curve ==&lt;br /&gt;
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2199</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2199"/>
		<updated>2021-05-15T20:34:22Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the scripts, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is extensive documentation available online.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package called ''fermipy'', which allows the data analysis to be done from within a python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. They are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run these scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options: using the currently existing conda environment, or creating the corresponding conda environment yourself. The first option is at the moment the recommended choice, partly because the pipeline scripts are written for python 2.7; for the latter option, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with this version of the conda environment. To invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, and then activate the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation that you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition to your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub; the full code, documentation, and installation instructions can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you install the right version of each (there was a switch from python 2.7 to python 3 between versions). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contains information about the events/photons measured by the LAT, and about the status of the spacecraft itself at each time. This information is needed for an analysis, as it is the main input. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, hence it is not necessary to download a specific dataset. If you are using the provided analysis scripts, they automatically look through all the data available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) will be created around your desired source or source coordinates, which contains all the sources included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are mostly present in the galactic plane and the Magellanic Clouds (e.g. features of the LMC, the Centaurus A lobes, and supernova remnants), can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In $FERMI_DIR, you can find the current catalog as well as the templates for all extended sources.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In addition, the LAT background models, needed to model the galactic and isotropic diffuse emission, can be found at '''/satdata/X-ray/Fermi/LAT_background_models'''.&lt;br /&gt;
== Creating a spectrum ==&lt;br /&gt;
&lt;br /&gt;
== Creating a light curve ==&lt;br /&gt;
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2198</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2198"/>
		<updated>2021-05-15T20:30:18Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the scripts, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is extensive documentation available online.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package called ''fermipy'', which allows the data analysis to be done from within a python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. They are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run these scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options: using the currently existing conda environment, or creating the corresponding conda environment yourself. The first option is at the moment the recommended choice, partly because the pipeline scripts are written for python 2.7; for the latter option, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with this version of the conda environment. To invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, and then activate the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh -f&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation that you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
In case you're using a bash shell, the addition to your .bashrc file should look something like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
alias fermipy='conda activate /userdata/data/gokus/conda/miniconda2/envs/fermipy'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Using the command '''fermipy''', you can now change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub; the full code, documentation, and installation instructions can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you install the right version of each (there was a switch from python 2.7 to python 3 between versions). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contains information about the events/photons measured by the LAT, and about the status of the spacecraft itself at each time. This information is needed for an analysis, as it is the main input. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, hence it is not necessary to download a specific dataset. If you are using the provided analysis scripts, they automatically look through all the data available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In the early steps of your analysis, a region of interest (ROI) will be created around your desired source or source coordinates, which contains all the sources included in the 10-year Fourth Point Source Catalog (4FGL). In addition, extended sources, which are present in the galactic plane and the Magellanic Clouds, can be part of that ROI, too. In order to run the pipeline scripts properly, two environment variables need to be added to your .cshrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh -f&lt;br /&gt;
setenv FERMI_DIR /satdata/X-ray/Fermi/catalogs&lt;br /&gt;
setenv LATEXTDIR $FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Alternatively for the bash shell, add the following to your .bashrc file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
export FERMI_DIR=/satdata/X-ray/Fermi/catalogs/&lt;br /&gt;
export LATEXTDIR=$FERMI_DIR/4FGL_extended_sources&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
== Creating a spectrum ==&lt;br /&gt;
&lt;br /&gt;
== Creating a light curve ==&lt;br /&gt;
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2197</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2197"/>
		<updated>2021-05-15T20:20:37Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the scripts, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is extensive documentation available online.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package called ''fermipy'', which allows the data analysis to be done from within a python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. They are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run these scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options: using the currently existing conda environment, or creating the corresponding conda environment yourself. The first option is at the moment the recommended choice, partly because the pipeline scripts are written for python 2.7; for the latter option, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with this version of the conda environment. To invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, and then activate the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh -f&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation that you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
Using the command '''fermipy''', you can change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub; the full code, documentation, and installation instructions can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you install the right version of each (there was a switch from python 2.7 to python 3 between versions). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contains information about the events/photons measured by the LAT, and about the status of the spacecraft itself at each time. This information is needed for an analysis, as it is the main input. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&amp;lt;br/&amp;gt;&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the servers of the observatory, hence it is not necessary to download a specific dataset. If you are using the provided analysis scripts, they automatically look through all the data available at the observatory.&lt;br /&gt;
All photon files are linked at '''/satdata/X-ray/Fermi/photon_ft1_files.list''', while all of the spacecraft data is merged in '''/satdata/X-ray/Fermi/spacecraft_ft2_files.fits'''.&lt;br /&gt;
== Creating a spectrum ==&lt;br /&gt;
&lt;br /&gt;
== Creating a light curve ==&lt;br /&gt;
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2196</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2196"/>
		<updated>2021-05-15T20:15:59Z</updated>

		<summary type="html">&lt;p&gt;Gokus: qui&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the scripts, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is extensive documentation available online.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package called ''fermipy'', which allows the data analysis to be done from within a python environment. Analysis scripts written with ''fermipy'' are available to create either a spectrum or a light curve. They are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run these scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options: using the currently existing conda environment, or creating the corresponding conda environment yourself. The first option is at the moment the recommended choice, partly because the pipeline scripts are written for python 2.7; for the latter option, one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with this version of the conda environment. To invoke it, it is necessary to source the conda installation in which the environment for a Fermi-LAT analysis has been created, and then activate the environment. To simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh -f&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation that you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
Using the command '''fermipy''', you can change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub; the full code, documentation, and installation instructions can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (GitHub), https://fermipy.readthedocs.io/en/latest/ (documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you install matching versions of each (between versions there was a switch from python 2.7 to python 3). It is recommended to read the full installation instructions provided in the documentation.&lt;br /&gt;
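As a rough sketch, creating such an environment with the versions quoted in Option 1 could look as follows. The channel names and version pins are assumptions and should be checked against the official installation instructions before running anything:&lt;br /&gt;

```shell
# Create a conda environment with the Fermitools pinned to the version
# used by the pipeline scripts, then add fermipy via pip.
# Channels (conda-forge, fermi) and the pins below are assumptions;
# consult the official installation instructions first.
conda create --name fermipy -c conda-forge -c fermi python=2.7 fermitools=1.2.23
conda activate fermipy
pip install fermipy==0.20.0
```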
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contain information about the events/photons measured by the LAT as well as the status of the spacecraft itself at each point in time. This information is the main input for any analysis. The corresponding files are called photon and spacecraft files. One can acquire the data needed for a specific analysis via the online data query at https://fermi.gsfc.nasa.gov/cgi-bin/ssc/LAT/LATDataQuery.cgi, or download weekly data products for the entire sky from https://fermi.gsfc.nasa.gov/ssc/data/access/.&lt;br /&gt;
However, all data acquired by Fermi-LAT during its mission are also stored on the observatory's servers, so it is usually not necessary to download a specific dataset yourself.&lt;br /&gt;
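If you do need to fetch weekly all-sky files by hand, a sketch of doing so from the HEASARC archive follows. The archive URL, the file-naming scheme, and the v001 version suffix are assumptions and should be checked against the actual archive listing:&lt;br /&gt;

```shell
# Build the name of a weekly photon file from a mission week number
# (naming scheme and v001 suffix are illustrative), then download it
# together with the merged spacecraft file.
week=700
file=$(printf 'lat_photon_weekly_w%03d_p305_v001.fits' "$week")
wget "https://heasarc.gsfc.nasa.gov/FTP/fermi/data/lat/weekly/photon/$file"
wget https://heasarc.gsfc.nasa.gov/FTP/fermi/data/lat/mission/spacecraft/lat_spacecraft_merged.fits
```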
&lt;br /&gt;
== Creating a spectrum ==&lt;br /&gt;
&lt;br /&gt;
== Creating a light curve ==&lt;br /&gt;
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2195</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2195"/>
		<updated>2021-05-15T20:09:48Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the script, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is an extensive documentation available online.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package, called ''fermipy'', such that one can do the data analysis from within a python environment. There are analysis scripts, written with ''fermipy'', available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options for the user to do so: calling a currently existing conda environment, or creating the corresponding conda environment yourself. The first version is, at the current moment, the recommended choice, also because the pipeline scripts are written for python 2.7, while for the latter one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the version of this conda environment. In order to invoke it, it is necessary to source the conda installation, where the environment for a Fermi-LAT analysis has been created, before activating the environment. In order to simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh -f&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
Using the command '''fermipy''', you can change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub and the full code, documentation and installation can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing the right versions of each (as in between versions there was a change in the use of python2.7 and python3). It is recommended to read the full installation instructions provided in the documentations.&lt;br /&gt;
&lt;br /&gt;
== The LAT data ==&lt;br /&gt;
The data downlinked from the Fermi satellite contains information about the events/photons measured by the LAT, and information about status of the spacecraft itself at each time. Those information are needed for an analysis, as they are the main input. These files are called photon and spacecraft files.&lt;br /&gt;
&lt;br /&gt;
== Creating a spectrum ==&lt;br /&gt;
&lt;br /&gt;
== Creating a light curve ==&lt;br /&gt;
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2194</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2194"/>
		<updated>2021-05-15T20:03:29Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;br/&amp;gt;&lt;br /&gt;
Before you use the script, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is an extensive documentation available online.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package, called ''fermipy'', such that one can do the data analysis from within a python environment. There are analysis scripts, written with ''fermipy'', available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options for the user to do so: calling a currently existing conda environment, or creating the corresponding conda environment yourself. The first version is, at the current moment, the recommended choice, also because the pipeline scripts are written for python 2.7, while for the latter one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the version of this conda environment. In order to invoke it, it is necessary to source the conda installation, where the environment for a Fermi-LAT analysis has been created, before activating the environment. In order to simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh -f&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
Using the command '''fermipy''', you can change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub and the full code, documentation and installation can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure you are installing the right versions of each (as in between versions there was a change in the use of python2.7 and python3). You can follow the &lt;br /&gt;
&lt;br /&gt;
== Creating a spectrum ==&lt;br /&gt;
&lt;br /&gt;
== Creating a light curve ==&lt;br /&gt;
&lt;br /&gt;
== Performing Fermi-LAT analysis using slurm ==&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2193</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2193"/>
		<updated>2021-05-15T20:00:48Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;/br&amp;gt;&lt;br /&gt;
Before you use the script, make sure you know what you're doing. If you are analysing gamma-ray data for the first time, please read the online documentation at https://fermi.gsfc.nasa.gov/ssc/data/analysis/LAT_essentials.html. Because the Fermi-LAT data can be accessed by everyone for free, there is an extensive documentation available online.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package, called ''fermipy'', such that one can do the data analysis from within a python environment. There are analysis scripts, written with ''fermipy'', available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options for the user to do so: calling a currently existing conda environment, or creating the corresponding conda environment yourself. The first version is, at the current moment, the recommended choice, also because the pipeline scripts are written for python 2.7, while for the latter one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the version of this conda environment. In order to invoke it, it is necessary to source the conda installation, where the environment for a Fermi-LAT analysis has been created, before activating the environment. In order to simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh -f&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
Using the command '''fermipy''', you can change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub and the full code, documentation and installation can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure to be sure&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2192</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2192"/>
		<updated>2021-05-15T19:57:43Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. For more information on the spacecraft, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov. &amp;lt;/br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package, called ''fermipy'', such that one can do the data analysis from within a python environment. There are analysis scripts, written with ''fermipy'', available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options for the user to do so: calling a currently existing conda environment, or creating the corresponding conda environment yourself. The first version is, at the current moment, the recommended choice, also because the pipeline scripts are written for python 2.7, while for the latter one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the version of this conda environment. In order to invoke it, it is necessary to source the conda installation, where the environment for a Fermi-LAT analysis has been created, before activating the environment. In order to simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh -f&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
Using the command '''fermipy''', you can change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub and the full code, documentation and installation can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure to be sure&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Contact person ==&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. She is also currently the maintainer of the LAT dataset on the Remeis server. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;br/&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2191</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2191"/>
		<updated>2021-05-15T19:51:37Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky. &lt;br /&gt;
For more information, please check out the official Fermi webpage at https://fermi.gsfc.nasa.gov.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package, called ''fermipy'', such that one can do the data analysis from within a python environment. There are analysis scripts, written with ''fermipy'', available to create either a spectrum or a light curve. Those are globally available at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options for the user to do so: calling a currently existing conda environment, or creating the corresponding conda environment yourself. The first version is, at the current moment, the recommended choice, also because the pipeline scripts are written for python 2.7, while for the latter one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
The pipeline scripts for a standard Fermi-LAT analysis (spectrum or light curve) are optimized to run with the version of this conda environment. In order to invoke it, it is necessary to source the conda installation, where the environment for a Fermi-LAT analysis has been created, before activating the environment. In order to simplify this, you can copy the following into your .cshrc file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/tcsh -f&lt;br /&gt;
source /userdata/data/gokus/conda/miniconda2/etc/profile.d/conda.csh&lt;br /&gt;
alias fermipy &amp;quot;conda activate fermipy&amp;quot; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you already have your own conda installation, which you regularly use, make sure that you create an alias instead of sourcing that conda installation by default!&lt;br /&gt;
Using the command '''fermipy''', you can change into the conda environment, in which the necessary tools for using the pipeline scripts are installed. Now you're ready to do some Science!&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub and the full code, documentation and installation can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure to be sure&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Contact person ====&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&amp;lt;/br&amp;gt;&lt;br /&gt;
Last status update on this wiki page: May 16, 2021&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2190</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2190"/>
		<updated>2021-05-15T19:31:01Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky.&lt;br /&gt;
&lt;br /&gt;
The following &lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package, called ''fermipy'', such that one can do the data analysis from within a python environment. There are analysis scripts, written using fermipy, available to create either a spectrum or a light curve, which are available at $FERMITOOLS, linking to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options for the user to do so: calling a currently existing conda environment, or creating the corresponding conda environment yourself. The first version is, at the current moment, the recommended choice, also because the pipeline scripts are written for python 2.7, while for the latter one needs to make sure the correct versions of the Fermitools and ''fermipy'' are installed.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub and the full code, documentation and installation can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
Please make sure to be sure&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Program text.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
==== Contact person ====&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2189</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2189"/>
		<updated>2021-05-15T19:06:06Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky.&lt;br /&gt;
&lt;br /&gt;
The following &lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package, called ''fermipy'', such that one can do the data analysis from within a python environment. There are analysis scripts, written using fermipy, available to create either a spectrum or a light curve, which are available at $FERMITOOLS, linking to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options for the user to do so: creating the corresponding conda environment yourself, or calling a currently existing conda environment.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub and the full code, documentation and installation can be found at &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': https://github.com/fermiPy/fermipy (Github), https://fermipy.readthedocs.io/en/latest/ (Documentation) &amp;lt;br/&amp;gt;&lt;br /&gt;
The newest version&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Program text.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
==== Contact person ====&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. In case of any further questions or problems, please contact her via andrea.gokus[at]fau.de.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2188</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2188"/>
		<updated>2021-05-15T19:04:40Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray data available for each day for every position on the sky.&lt;br /&gt;
&lt;br /&gt;
The following &lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a python package, called ''fermipy'', such that one can do the data analysis from within a python environment. There are analysis scripts, written using fermipy, available to create either a spectrum or a light curve, which are available at $FERMITOOLS, linking to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, it is necessary to have both the Fermitools and ''fermipy'' installed, ideally in a conda environment. There are two options for the user to do so: creating the corresponding conda environment yourself, or calling a currently existing conda environment.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Installing and creating a conda environment yourself ===&lt;br /&gt;
Both the Fermitools and ''fermipy'' are managed via GitHub; the full code, documentation, and installation instructions can be found at: &amp;lt;br/&amp;gt;&lt;br /&gt;
Fermitools: https://github.com/fermi-lat/Fermitools-conda &amp;lt;br/&amp;gt;&lt;br /&gt;
''fermipy'': GitHub repository: https://github.com/fermiPy/fermipy Documentation: https://fermipy.readthedocs.io/en/latest/ &amp;lt;br/&amp;gt;&lt;br /&gt;
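Creating the environment yourself might look like the following sketch. The channel names follow the Fermitools-conda README at the time of writing, and the environment name ''fermi'' is an assumption; check the README linked above for the current command and version pins:

```shell
# Sketch of Option 1: create a conda environment containing the Fermitools,
# then add fermipy on top. Channel and package names follow the
# Fermitools-conda README -- verify against the current instructions there.
conda create --name fermi -c conda-forge -c fermi fermitools
conda activate fermi
# fermipy can be installed into the same environment via pip.
pip install fermipy
```

Pinning explicit versions (e.g. `fermitools=2.2.0`) in the `conda create` line makes the environment reproducible, at the cost of having to update the pin by hand.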
The newest version&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
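Using the existing shared environment then reduces to activating it. The environment name below is hypothetical; ask the contact person at the bottom of this page for the actual name or path on the Remeis systems:

```shell
# Sketch of Option 2: activate an already installed environment instead of
# building your own. The name "fermi" is a placeholder -- substitute the
# actual shared environment name or path.
conda activate fermi
# Quick sanity check that fermipy is importable in this environment.
python -c "import fermipy; print(fermipy.__version__)"
```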
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Program text.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
==== Contact person ====&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. For any further questions or problems, please contact her at andrea.gokus[at]fau.de.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2187</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2187"/>
		<updated>2021-05-15T18:54:18Z</updated>

		<summary type="html">&lt;p&gt;Gokus: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day; hence, gamma-ray data are available for every position on the sky for each day.&lt;br /&gt;
&lt;br /&gt;
The following &lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a Python package called ''fermipy'', which allows the data analysis to be done from within a Python environment. Analysis scripts written with fermipy are available to create either a spectrum or a light curve; they can be found at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, both the Fermitools and ''fermipy'' need to be installed, ideally in a conda environment. There are two options: creating the corresponding conda environment yourself, or using an already existing conda environment.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Installing and creating a conda environment yourself ===&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Program text.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
==== Contact person ====&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. For any further questions or problems, please contact her at andrea.gokus[at]fau.de.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
	<entry>
		<id>https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2186</id>
		<title>Fermi-LAT</title>
		<link rel="alternate" type="text/html" href="https://www.sternwarte.uni-erlangen.de/wiki/index.php?title=Fermi-LAT&amp;diff=2186"/>
		<updated>2021-05-15T18:53:53Z</updated>

		<summary type="html">&lt;p&gt;Gokus: Created page with &amp;quot;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day, hence, there is gamma-ray dat...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Fermi Large Area Telescope (LAT) observes the gamma-ray sky from 20 MeV up to 1 TeV. It does so by scanning the entire sky three times a day; hence, gamma-ray data are available for every position on the sky for each day.&lt;br /&gt;
&lt;br /&gt;
The following &lt;br /&gt;
&lt;br /&gt;
== Installation &amp;amp; preparation ==&lt;br /&gt;
To analyse data from Fermi-LAT, the Fermitools are needed. Additionally, there is a Python package called ''fermipy'', which allows the data analysis to be done from within a Python environment. Analysis scripts written with fermipy are available to create either a spectrum or a light curve; they can be found at $FERMITOOLS, which links to /software/Science/satscripts/fermiscripts.&amp;lt;br /&amp;gt;&lt;br /&gt;
To run those scripts, both the Fermitools and ''fermipy'' need to be installed, ideally in a conda environment. There are two options: creating the corresponding conda environment yourself, or using an already existing conda environment.&lt;br /&gt;
&lt;br /&gt;
=== Option 1 - Installing and creating a conda environment yourself ===&lt;br /&gt;
&lt;br /&gt;
=== Option 2 - Using the existing conda environment (Fermitools 1.2.23 &amp;amp; fermipy version 0.20.0) ===&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Program text.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
==== Contact person ====&lt;br /&gt;
The scripts and this wiki entry have been written by Andrea Gokus. For any further questions or problems, please contact her at andrea.gokus[at]fau.de.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Data Extraction]]&lt;/div&gt;</summary>
		<author><name>Gokus</name></author>
	</entry>
</feed>