Adaptive selfcalibration for Allen Telescope Array imaging

Garrett Keating, William C. Barott & Melvyn Wright
Radio Astronomy Laboratory, University of California, Berkeley, CA 94720

ABSTRACT

Planned instruments such as the Atacama Large Millimeter Array (ALMA), the Large Synoptic Survey Telescope (LSST) and the Square Kilometer Array (SKA) will produce datasets measured in petabytes. Innovative approaches in signal processing, computing hardware, algorithms, and data handling are necessary. In this paper we describe the automated data processing used to handle the high data rate and RFI in close to real time at the ATA. Automated data quality control, monitoring and editing are essential for real-time data processing. Three subsystems (RFI excision, calibration, and imaging) function with limited or no a-priori information about the sky or telescope system: they use a-priori calibrations, iteratively flag data, and image the sky to build a model of the calibrator field, self-calibrating to that model.

1. Introduction

Data processing poses significant problems for large telescope arrays. The user is primarily interested in the astronomy, and less interested in the details of data acquisition and data processing. Next generation radio telescopes such as ALMA and the SKA will only achieve their full potential if they can easily be used by non radio astronomy specialists. In many cases the desired output is calibrated images and phased array beams.

Aperture synthesis imaging in radio astronomy has been developed with arrays of 10 to 100 antennas. In the current data processing paradigm, the cross correlations are formed on-line in custom-designed correlator hardware. The correlations and calibration data are written to datasets which are transferred to off-line computers, where the calibrations are applied and images are made using data reduction packages. There is a severe mismatch between the data rates in the on-line correlator hardware and those supported by off-line processing, which can typically handle only a few percent of the data rates large correlators are capable of producing. This can be resolved by integrating the calibration and imaging into the data acquisition process. For phased array beams, the signals from multiple telescopes are combined and the calibration must be made in real time, as the data streams from the individual telescope elements are not stored. RFI is best handled in close to real time. In this paper we describe the automated data processing used to handle the high data rate and RFI in close to real time at the ATA.

Fig. 1. The ATA 42-antenna array at Hat Creek, CA.

2. Allen Telescope Array

The Allen Telescope Array (ATA) is an aperture synthesis array of 6-m diameter antennas equipped with broadband, dual polarization receivers covering 0.5 to 11 GHz. Multiple backends allow simultaneous imaging observations, pulsar observations, point source spectroscopy, and SETI searches. The wide field-of-view, the ability to observe anywhere in the instantaneous bandwidth of the feed, and large blocks of observing time are ideally suited to making large surveys and to monitoring temporal variations in a wide variety of sources.

2.1. Correlators

We use an FX correlator architecture: voltage signals from each antenna and each polarization are divided into 1024 frequency channels using a polyphase filter bank algorithm (the F stage), and then cross-multiplied to measure the correlation properties of the incident radiation (the X stage). The current ATA has 4 separate correlators, each with 100 MHz bandwidth. There is a custom backplane between the F and X sections of the correlator and a maximum sample rate of 1 Hz. In the next generation of DSP, we will use a packetized 500 MHz bandwidth correlator developed in collaboration with the CASPER [1] group, using commercial 10-Gbit Ethernet (10-GbE) switches to route the data through the correlator and enable sample rates of 100-1000 Hz into a CPU/GPU cluster.

2.2. Beamformers

The ATA currently has 3 beamformers processing 100 MHz of bandwidth into phased array beams for pulsar and SETI analysis backends.

[1] Collaboration for Astronomical Signal Processing & Engineering Research; http://casper.berkeley.edu
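To make the FX architecture above concrete, the following is a minimal Python/NumPy sketch (illustrative only, not ATA firmware or software): a plain FFT filterbank stands in for the polyphase filter bank of the F stage, and the X stage cross-multiplies and time-averages the channelized streams for one baseline. The function names f_stage and x_stage and the simulated delay are invented for the example.

import numpy as np

def f_stage(voltages, n_chan):
    """Channelize a real voltage stream into n_chan complex channels.
    A plain FFT filterbank is used as a stand-in for the ATA's polyphase
    filter bank; a real PFB would apply a windowed FIR prototype filter
    across several blocks before the FFT."""
    n_spec = voltages.size // (2 * n_chan)
    blocks = voltages[:n_spec * 2 * n_chan].reshape(n_spec, 2 * n_chan)
    return np.fft.rfft(blocks, axis=1)[:, :n_chan]      # shape (n_spec, n_chan)

def x_stage(spec_a, spec_b):
    """Cross-multiply and time-average two channelized streams to form
    one baseline's visibility spectrum."""
    return np.mean(spec_a * np.conj(spec_b), axis=0)

# Toy example: a common noise signal arriving at antenna A a few samples
# earlier than at antenna B, which appears as a phase slope across frequency.
rng = np.random.default_rng(0)
n_chan, n_samp, delay = 1024, 2 * 1024 * 1000, 3
sky = rng.normal(size=n_samp + delay)
volt_a = sky[delay:] + 0.1 * rng.normal(size=n_samp)
volt_b = sky[:n_samp] + 0.1 * rng.normal(size=n_samp)
vis_ab = x_stage(f_stage(volt_a, n_chan), f_stage(volt_b, n_chan))
print(vis_ab.shape, np.angle(vis_ab[:4]))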

2.3. Real-Time Imager

Realtime imaging is required to handle the high data rates from large surveys and transient phenomena. For wide-field synthesis arrays with a large number of antenna elements the data rates are too large for continuous data acquisition. SKA Memo 60 (Wright, 2005) discusses pertinent issues regarding the design of a realtime system. We have implemented a realtime imaging pipeline on the ATA, with sampling at 10 s and imaging at 100 s (Keating, Barott & the Allen Telescope Array Team, "Results from the Allen Telescope Array: Real-Time Imaging", 2009 AAS 214th Meeting, 601.06).

The high data rates required for transient imaging, and for larger arrays such as the SKA, require seamlessly integrating the imaging and beam formation into the data acquisition process. Images can be made simultaneously in multiple regions within the field of view by integrating the output from the correlators at multiple phase centers on targets of interest, calibration sources, and sources whose sidelobes confuse the regions of interest. Calibration in close to realtime uses a model of the sky brightness distribution. The derived calibration parameters are fed back into the imagers and beam formers. The regions imaged update and improve the a-priori model, which becomes the final calibrated image by the time the observations are complete (Fig. 2).

[Fig. 2 block diagram components: Packetized Ethernet Switch; Gains(s,f,p); Model(s,f,p); PBeam(s,f,p); Bandpass(f); PolCal(f,p); Solver; Beam Solver; Imager; Astronomy Control.]
Fig. 2. Data flow in a realtime imaging system. Measured visibilities are compared with model visibilities and the residuals are used to update the model. This model becomes the final image.
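The feedback loop of Fig. 2, in which measured visibilities are compared with model visibilities and the residuals update the calibration, can be illustrated with a self-contained toy example. The sketch below is plain NumPy, not the ATA solver: it simulates a 1 Jy point-source calibrator observed through unknown complex antenna gains and recovers those gains with a standard alternating least-squares (self-calibration) update. The antenna count, noise level, damping factor and iteration count are arbitrary choices for the illustration.

import numpy as np

rng = np.random.default_rng(1)
n_ant = 8

# True antenna gains (unknown to the solver) and model visibilities for a
# 1 Jy point source at the phase center: model V_ij = 1 on every baseline.
true_g = (1 + 0.2 * rng.normal(size=n_ant)) * np.exp(1j * rng.uniform(-1, 1, n_ant))
model = np.ones((n_ant, n_ant), dtype=complex)

# Measured visibilities V_ij = g_i conj(g_j) model_ij + noise.
vis = np.outer(true_g, np.conj(true_g)) * model
vis += 0.01 * (rng.normal(size=vis.shape) + 1j * rng.normal(size=vis.shape))

# Alternating least-squares gain solution: with the other antennas' gains
# held fixed, each g_i has a closed-form estimate from its baselines.
g = np.ones(n_ant, dtype=complex)
for _ in range(30):
    g_new = np.empty_like(g)
    for i in range(n_ant):
        others = [j for j in range(n_ant) if j != i]
        num = sum(vis[i, j] * g[j] * np.conj(model[i, j]) for j in others)
        den = sum(np.abs(g[j] * model[i, j]) ** 2 for j in others)
        g_new[i] = num / den
    g = 0.5 * (g + g_new)                 # damping for stable convergence

# Remove the global phase ambiguity before comparing with the truth.
g *= np.exp(-1j * np.angle(g[0])) * np.exp(1j * np.angle(true_g[0]))
print(np.max(np.abs(g - true_g)))         # should be near the noise level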

The next generation of DSP on the ATA will be a Correlator-Beamformer-Imaging engine (CoBI), which will integrate the calibration and imaging into the data acquisition process by streaming uv-data and associated metadata to a Beowulf cluster using standard 10-GbE packetized data and switches. For fast transient studies, analysis will proceed on the streaming data; for slower data rates the original data can be stored and re-used at high bandwidths until the best possible calibration has been obtained.

2.4. ARTIS: Experience with Realtime Imaging on the ATA

Automated data quality control, monitoring and editing are essential for realtime data processing. During the last three years we have developed the Automated Real Time Imaging System (ARTIS) for the ATA. ARTIS has three subsystems: RFI excision, calibration, and imaging. Each subsystem meets three requirements: processing time cannot exceed the observation time; each must be robust enough that bad results are corrected; and each must function with limited or no a-priori information about the sky or system. ARTIS is now routinely used on the ATA.

ARTIS uses calibration observations to determine amplitude, phase, and bandpass corrections for each antenna, iteratively flagging data that exceed RMS scatter or closure limits for both amplitude and phase. ARTIS then images the calibrator data, building a more complex model for the calibrator field and self-calibrating to that model. During the imaging cycle, ARTIS uses the calibrated data to find a flagging, deconvolution and self-calibration solution, maximizing image quality as measured by dynamic range and image fidelity. During the deconvolution cycle, ARTIS first uses CLEAN to derive a preliminary deconvolution model, and then decides which deconvolution method (CLEAN or MEM) and how many iterations to use, testing the deconvolution solution by looking for point sources and noise statistics in the residual map. ARTIS then establishes upper and lower amplitude limits for visibilities, and flags data outside these limits. ARTIS finally self-calibrates on the image, and repeats the imaging cycle to maximize the dynamic range and image fidelity. A sketch of the deconvolution step of this cycle is given below.

The experience we have gained with these early tools provides a solid basis for the next stages of realtime imaging with large-N arrays. ARTIS is based on the MIRIAD package, which is suitable for extension to the multi-CPU environment required for CoBI. Many of the problems of calibration and imaging are embarrassingly parallel. Significant parallelization can be achieved through multiple threads rather than the much more complex task of developing massively interconnected software.
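The deconvolution step of the ARTIS imaging cycle can be illustrated with a toy example. The sketch below is plain NumPy rather than the MIRIAD tasks ARTIS actually drives: it runs a minimal Hogbom CLEAN on a simulated dirty image and stops when the residual-map peak falls to a few times the residual RMS, in the spirit of ARTIS's test for remaining point sources and noise statistics in the residual map. The PSF, source list and thresholds are invented for the example.

import numpy as np

def hogbom_clean(dirty, psf, gain=0.1, threshold_sigma=5.0, max_iter=500):
    """Minimal Hogbom CLEAN on a square image: repeatedly subtract a scaled,
    shifted PSF at the residual peak until the peak drops below
    threshold_sigma times the residual RMS (the noise-statistics test)."""
    residual = dirty.copy()
    model = np.zeros_like(dirty)
    n = dirty.shape[0]
    c = psf.shape[0] // 2                       # PSF is centered at (c, c)
    for _ in range(max_iter):
        peak = np.unravel_index(np.argmax(np.abs(residual)), residual.shape)
        if np.abs(residual[peak]) < threshold_sigma * np.std(residual):
            break                               # residual looks like noise: stop
        flux = gain * residual[peak]
        model[peak] += flux
        y, x = peak                             # subtract PSF over the overlap
        y0, y1 = max(0, y - c), min(n, y + c + 1)
        x0, x1 = max(0, x - c), min(n, x + c + 1)
        residual[y0:y1, x0:x1] -= flux * psf[c - (y - y0): c + (y1 - y),
                                             c - (x - x0): c + (x1 - x)]
    return model, residual

# Simulated dirty image: three point sources convolved with a dirty beam
# that has modest sidelobes, plus thermal noise.
rng = np.random.default_rng(2)
n, c = 64, 10
yy, xx = np.mgrid[-c:c + 1, -c:c + 1]
psf = np.exp(-(yy**2 + xx**2) / 4.0) + 0.1 * np.cos(0.8 * xx) * np.exp(-yy**2 / 16.0)
psf /= psf.max()
sources = [(20, 30, 1.0), (40, 12, 0.6), (50, 50, 0.3)]
dirty = 0.01 * rng.normal(size=(n, n))
for y, x, f in sources:
    dirty[y - c:y + c + 1, x - c:x + c + 1] += f * psf

model, resid = hogbom_clean(dirty, psf)
for y, x, f in sources:
    print(f"true flux {f:.2f}  CLEANed flux {model[y-1:y+2, x-1:x+2].sum():.2f}")
print(f"residual rms {resid.std():.3f}")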

2.5. Computing Requirements for Fast Transient Source Surveys

A major challenge in fast transient source surveys is providing the significant computing resources required. We will sample visibilities on time scales down to 10 ms, image in individual channels, and search the images for individual pulses with a range of dispersion measures (DM). The data rate from the CoBI on ATA-42 will be 22 GB/s with all baselines, all channels, and all Stokes parameters. The data rates can be reduced by matching the channelization to the maximum DM explored. Post-RFI subtraction will reduce the data rate to 0.2 to 0.7 GB/s. The volume of data requires an efficient and robust processing pipeline. We also include a large data archive (~60 TB) that can hold 1 to 3 days of data, enabling us to post-process some data or deal with experiments in which the instantaneous data rate exceeds that of our computing cluster.

This imaging approach gives a conservative estimate of the computational demand. We estimate the number of operations for each image as N_ops = M + N^2 log2(N), where M is the number of baselines and N is the number of pixels on a side of the image. Incoherent de-dispersion requires coadding images across the band and adjusting for relative delays of roughly a second, or 10^2 integrations. Typical computational rates are about 1 Tflops. This is comparable to the capability of the proposed computing cluster, estimated at 0.5 to 1 Tflops; on-the-fly processing and/or short-term data archiving are possible. A back-of-envelope evaluation of these estimates is sketched after the references.

3. References

D. A. Mitchell, L. J. Greenhill, R. B. Wayth, R. J. Sault, C. J. Lonsdale, R. J. Cappallo, M. F. Morales, and S. M. Ord, "Real-Time Calibration of the Murchison Widefield Array", IEEE Journal of Selected Topics in Signal Processing, Special Issue on Signal Processing for Astronomical and Space Research Applications (11 pages, 5 figures; accepted for the October issue).
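As a rough check of the numbers in Section 2.5, the short Python sketch below evaluates the per-image operation count N_ops = M + N^2 log2(N) for ATA-42 and the implied sustained rate. The image size, channel count and imaging rate used here are illustrative assumptions, not values taken from this memo.

import math

# Cost model from Section 2.5: N_ops = M + N**2 * log2(N) operations per image,
# where M is the number of baselines and N the image size in pixels per side.
n_ant = 42                       # ATA-42
M = n_ant * (n_ant - 1) // 2     # number of baselines (861)

# Illustrative assumptions (not specified in the memo):
N = 1024                         # image pixels per side
n_chan = 1024                    # channels imaged independently
frame_rate = 100.0               # images per second (10-ms visibility sampling)

ops_per_image = M + N**2 * math.log2(N)
total_rate = ops_per_image * n_chan * frame_rate

print(f"baselines M = {M}")
print(f"ops per image  ~ {ops_per_image:.2e}")
print(f"sustained rate ~ {total_rate:.2e} ops/s "
      f"(of order 1 Tflops, comparable to the 0.5-1 Tflops cluster estimate)")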