
zap Documentation
Release 1.0.dev86

Kurt Soto

February 03, 2016

Contents

1 Installation
  1.1 Requirements
  1.2 Steps
2 Examples
  2.1 Sparse Field Case
  2.2 Filled Field Case
3 Extra Functions
4 Command Line Interface
5 Interactive mode


ZAP (the Zurich Atmosphere Purge) is a high-precision sky subtraction tool which can be used as a complete sky subtraction solution, or as an enhancement to previously sky-subtracted data. The method uses PCA to isolate the residual sky subtraction features and remove them from the observed datacube. Although the operation of ZAP does not depend on perfect flatfielding of the data in a MUSE exposure, better results are obtained when these corrections are made ahead of time.
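As a rough conceptual illustration only (this is not ZAP's actual implementation), the sketch below removes the leading principal components from a toy sky-dominated spectral array; the array shapes and number of modes are arbitrary assumptions:

    import numpy as np

    # Toy stand-in for a (wavelength, spaxel) array dominated by sky residuals.
    nwave, nspax, nmodes = 3000, 500, 5
    data = np.random.normal(size=(nwave, nspax))

    # Subtract the mean spectrum (average over spaxels), find the leading
    # eigenspectra with an SVD, and remove their projection from every
    # spaxel's spectrum.
    mean_spec = data.mean(axis=1, keepdims=True)
    centered = data - mean_spec
    U, s, Vt = np.linalg.svd(centered, full_matrices=False)
    eigenspectra = U[:, :nmodes]
    projection = eigenspectra @ (eigenspectra.T @ centered)
    cleaned = data - projection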


CHAPTER 1 Installation

1.1 Requirements

ZAP requires the following packages:

- Numpy 1.6.0 or later
- Astropy v1.0 or later
- SciPy v0.13.3 or later

Many linear algebra operations are performed in ZAP, so it can be beneficial to use an alternative BLAS package. In the Anaconda distribution, the default BLAS shipped with Numpy is linked to OpenBLAS, which can amount to a 20% speedup of ZAP.

1.2 Steps

Once the code is downloaded, cd into the zap directory and install via:

    python setup.py install
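As a quick sanity check before installing (not part of ZAP itself), the installed versions of the required packages can be printed from Python:

    # Optional check that the requirements above are satisfied.
    import numpy
    import scipy
    import astropy

    print('numpy  ', numpy.__version__)    # needs >= 1.6.0
    print('scipy  ', scipy.__version__)    # needs >= 0.13.3
    print('astropy', astropy.__version__)  # needs >= 1.0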


CHAPTER 2 Examples

In its most hands-off form, ZAP can take an input fits datacube, operate on it, and output a final fits datacube:

    import zap
    zap.process('input.fits', outcubefits='output.fits')

Care should be taken, however, since this case assumes a sparse field, and better results can be obtained by applying masks. The main function is zap.process; there are a number of options that can be passed to it, which are described below.

The code can handle datacubes trimmed in wavelength space. Since the code uses the correlation of segments of the emission line spectrum, it is best to trim the cube at specific wavelengths. The cube can include any connected subset of the following segments (for example 6400-8200 Angstroms):

    [0, 5400]    [5400, 5850]  [5850, 6440]  [6440, 6750]
    [6750, 7200] [7200, 7700]  [7700, 8265]  [8265, 8602]
    [8602, 8731] [8731, 9275]  [9275, 10000]

2.1 Sparse Field Case

This case refers specifically to the situation where the sky can be measured in the science frame itself, using:

    zap.process('input.fits', outcubefits='output.fits')

In both cases, the code will create a processed datacube named DATACUBE_ZAP.fits and an SVD file named ZAP_SVD.fits in the current directory. While this can work well in the case of very faint sources, masks can improve the results.

For the sparse field case, a mask file can be included: a 2d fits image matching the spatial dimensions of the input datacube. Masks are defined to be >= 1 on astronomical sources and 0 at the position of the sky. Set this parameter with the mask keyword:

    zap.process('input.fits', outcubefits='output.fits', mask='mask.fits')

2.2 Filled Field Case

This approach can also address the saturated field case and is robust in the presence of strong emission lines. In this case, the input includes an offset sky observation. To achieve this, we calculate the SVD on an external sky frame using the function zap.svdoutput. An example of running the code in this way is as follows:

    zap.svdoutput('offset_field_cube.fits', svdfn='zap_svd.fits', mask='mask.fits')
    zap.process('source_cube.fits', outcubefits='output.fits', extsvd='zap_svd.fits', cfwidthsp=50)

The integration time of this frame does not need to match the object exposure; a 2-3 minute exposure is typically sufficient. Residuals can often be further reduced by lowering cfwidthsp, but this parameter should not be reduced below 15 pixels.
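Both of the cases above accept a 2d mask image through the mask keyword. The following is a minimal sketch (not part of ZAP) of how such a mask could be built from a white-light image of the cube; the extension index, file names, and threshold are assumptions:

    import numpy as np
    from astropy.io import fits

    # Collapse the cube along the spectral axis to a white-light image,
    # then flag bright pixels as sources (>= 1) and leave sky pixels at 0.
    with fits.open('input.fits') as hdul:
        cube = hdul[1].data              # MUSE cubes typically keep the data in extension 1
    white = np.nansum(cube, axis=0)
    threshold = np.nanmedian(white) + 3 * np.nanstd(white)   # assumed simple threshold
    mask = (white > threshold).astype(np.int16)
    fits.writeto('mask.fits', mask, overwrite=True)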

CHAPTER 3 Extra Functions

Aside from the main process, three functions are included that can be run outside of the full ZAP process to facilitate further investigation.


CHAPTER 4 Command Line Interface

ZAP can also be used from the command line:

    python -m zap INPUT_CUBE.fits

More information on the use of the command line interface can be found with the command:

    python -m zap -h
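As an illustration only (the file names are assumptions, and only the documented invocation above is used), several cubes could be processed from separate working directories with a small Python driver, since the default outputs are written to the current directory:

    import os
    import subprocess

    # Hypothetical batch driver around the documented command-line call.
    # Each cube is processed from its own working directory so the default
    # output files do not overwrite each other.
    cubes = ['exposure1.fits', 'exposure2.fits', 'exposure3.fits']  # assumed names
    for cube in cubes:
        workdir = os.path.splitext(cube)[0]
        os.makedirs(workdir, exist_ok=True)
        subprocess.run(['python', '-m', 'zap', os.path.abspath(cube)],
                       cwd=workdir, check=True)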


CHAPTER 5 Interactive mode

ZAP can also be used interactively from within IPython:

    import zap
    zobj = zap.process('input.fits', interactive=True)

When run in this mode, zap.process operates on the datacube and retains all of the data and methods necessary to process a final data cube in a Python class named zclass. You can elect to investigate the data product via the zclass, and even reprocess the cube with a different number of eigenspectra per region. A workflow may go as follows:

    import zap
    from matplotlib import pyplot as plt

    # allow ZAP to run the optimize routine
    zobj = zap.process('input.fits', optimization='normal', interactive=True)

    # plot the variance curves and the selection of the number of eigenspectra used
    zobj.plotvarcurve(5)

    # plot a spectrum extracted from the original cube
    plt.figure()
    plt.plot(zobj.cube[:, 50:100, 50:100].sum(axis=(1, 2)), 'b', alpha=0.3)

    # plot a spectrum of the cleaned ZAP data product
    plt.plot(zobj.cleancube[:, 50:100, 50:100].sum(axis=(1, 2)), 'g')

    # choose just the first 3 eigenspectra for all segments
    zobj.reprocess(nevals=3)

    # plot a spectrum extracted from the original cube
    plt.plot(zobj.cube[:, 50:100, 50:100].sum(axis=(1, 2)), 'b', alpha=0.3)

    # plot a spectrum of the cleaned ZAP data product
    plt.plot(zobj.cleancube[:, 50:100, 50:100].sum(axis=(1, 2)), 'g')

    # choose some number of modes by hand
    zobj.reprocess(nevals=[2, 5, 2, 4, 6, 7, 9, 8, 5])

    # plot a spectrum
    plt.plot(zobj.cleancube[:, 50:100, 50:100].sum(axis=(1, 2)), 'k')

    # use the optimization algorithm to identify the best number of modes per segment
    zobj.optimize()

    # compare to the previous versions
    plt.plot(zobj.cleancube[:, 50:100, 50:100].sum(axis=(1, 2)), 'r')

    # identify a pixel in the dispersion axis that shows a residual feature in the original
    plt.figure()
    plt.matshow(zobj.cube[2903, :, :])

    # compare this to the ZAP data product
    plt.figure()
    plt.matshow(zobj.cleancube[2903, :, :])

    # write the processed cube as a single extension fits
    zobj.writecube('datacube_zap.fits')

    # or merge the ZAP datacube into the original input datacube, replacing the data extension
    zobj.writefits(outcubefits='datacube_final_zap.fits')
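As a small follow-on sketch (not from the ZAP documentation), the sky component that ZAP actually removed can be inspected by differencing the two cubes retained on the zclass object; only the cube and cleancube attributes shown above are assumed:

    import numpy as np
    from matplotlib import pyplot as plt

    # Difference of the original and cleaned cubes = the component ZAP subtracted.
    removed = zobj.cube - zobj.cleancube

    plt.figure()
    plt.plot(np.nansum(removed[:, 50:100, 50:100], axis=(1, 2)), 'k')
    plt.xlabel('spectral pixel')
    plt.ylabel('summed flux removed in the 50:100, 50:100 aperture')
    plt.show()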