Nimble telescope design: Argus’s design has gone through many iterations, dramatically reducing costs compared to our starting point, with each step validated by on-sky prototypes.
The Argus Array’s Prototype Series
Argus-ADDEM (2025+) is designed to rapidly produce the long-term data required to validate the Argus coaddition pipelines over the full survey length.
With six co-pointed telescopes, ADDEM accumulates data on a very small area of the sky six times more rapidly than the full array will, allowing year-timescale tests of the full array depth.
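To make that scaling concrete, the sketch below works out the coadd depth gain, assuming background-limited stacking in which signal-to-noise grows as the square root of the number of exposures (an illustrative assumption, not a figure from the Argus papers):

```python
import math

def coadd_depth_gain(n: int) -> float:
    """Magnitude improvement from coadding n background-limited
    exposures: S/N grows as sqrt(n), so the limiting magnitude
    deepens by 2.5 * log10(sqrt(n)) = 1.25 * log10(n)."""
    return 1.25 * math.log10(n)

# Six co-pointed telescopes accumulate epochs on one field six times
# faster than the survey spreads them across the sky, so one ADDEM year
# samples the coadd depth the full array reaches in six survey years.
print(f"Depth gain from 6x the epochs: {coadd_depth_gain(6):.2f} mag")
```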
[Figure: Graduate students Lawrence Machia, Tommy Proctor, Mae Dubay, and Will Marshall, with Argus Project Manager Alan Vasquez Soto, after the deployment of Argus-ADDEM at Sierra Remote Observatory. Right panel: Argus-ADDEM on-sky.]
Argus Pathfinder (2022-24) was a 38-telescope array designed to retire the development risks associated with the Argus Array’s unique design (Vasquez+2022; Machia+2022; Law+2022b; Galliher+2022; Corbett+2022b). Pathfinder had the full pseudofocal sealed-enclosure design, and for dome-seeing effects its enclosure acted as a 40%-path-length scale model of the full Array’s enclosure. After construction was completed in Chapel Hill, we deployed Pathfinder in December 2022 to the PARI observatory in the Appalachian mountains of North Carolina for commissioning. Pathfinder operated at PARI for one year, demonstrating the long-term performance, reliability, and safety of its systems.
The Argus Technology Demonstrator (2021-22) was designed to develop the tracking drive system and polar-alignment methods, obtain data for pipeline development, and train the team’s graduate students. The system is described in more detail in Corbett+2022a, Gonzalez+2022, and Vasquez+2022.
The Argus Array concept, covering the entire sky simultaneously with an array of telescopes, has been pioneered by our Evryscope telescopes in Chile (2015+) and California (2018+). Together, the systems observe 16,512 square degrees with 1.4 GPix in each two-minute exposure, covering a total of 28,000 square degrees for two to six hours each night. The single-exposure limiting magnitude is m_g = 15-16, depending on conditions.
The Southern Evryscope’s construction was funded by NSF-ATI; the Northern Evryscope was funded by a collaboration with San Diego State University. Evryscope operations were funded by NSF-CAREER and NSF-AAG. The Evryscopes’ pipelines produced a) light curves with 10⁵ epochs per year for 15M objects across the sky and b) real-time transient detections with neural-network-based false-positive discrimination, which has enabled the fastest-yet searches for gravitational-wave counterparts (Corbett et al. 2019) and rapid stellar-superflare follow-up (Corbett et al. 2020).
The Evryscope science programs have produced the most stringent measurements of the bright, rare stellar flares that affect planetary habitability (Howard et al. 2018, Howard et al. 2019a, Howard et al. 2019b, Glazier et al. 2020), planet-detection-survey and eclipsing-binary discoveries around main-sequence stars (Ratzloff et al. 2019a, Galliher et al. 2020), discoveries of exotic stellar binaries around white dwarfs and hot subdwarfs (Ratzloff et al. 2019d, Ratzloff et al. 2019e, Ratzloff et al. 2020a), technical papers (Law et al. 2015, Law et al. 2016, Ratzloff et al. 2016, Ratzloff et al. 2019b, Ratzloff et al. 2019c), and other results from the wider astronomical community (Tokovinin et al. 2018, Kosiarek et al. 2018).
Real-time data reduction for gigapixel astronomy: We collected four million 29 MPix images from the Evryscope cameras, totaling half a petabyte of raw data and over a trillion photometric measurements in 30,000 to 40,000 epochs, depending on sky position. Each two-minute Evryscope observation covers 16,512 sq. deg. in a 5 GB image, adding up to an average of 1.2 TB of calibrated, science-ready data per night. To handle this data volume and rate, we developed a fast, robust software pipeline to meet our science requirements: few-minute-latency transient alerts and long-term, high-precision light curves from wide-field images containing extreme PSF variation. This pipeline forms the precursor system for the Argus data analysis.
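As a rough check on these rates, a minimal sketch (the 5 GB per-observation size is from the text above; the six-hour observing night is an illustrative assumption):

```python
# Back-of-envelope arithmetic for the Evryscope data rates quoted above.
# The 5 GB per-observation size is from the text; a six-hour observing
# night at two-minute cadence is an illustrative assumption.
obs_size_gb = 5                  # one two-minute, 16,512 sq. deg. observation
obs_per_night = (6 * 60) // 2    # two-minute cadence over a 6 h night
nightly_tb = obs_size_gb * obs_per_night / 1000
print(f"~{nightly_tb:.1f} TB of raw images per night")
# ~0.9 TB raw, consistent with ~1.2 TB/night of calibrated products.
```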
The Evryscope pipelines are responsible for image-quality vetting, calibration, astrometry, image collation in a large database, and the production of long-term, high-precision light curves. The pipeline includes i) a bespoke astrometric fitting routine; ii) a compound flat-fielding technique using both photometric and twilight-flat measurements; iii) a scratch-built forced-aperture-photometry pipeline; iv) a light-curve detrending system with verified, published detections of signals as small as a few millimagnitudes; v) a high-performance, purpose-built data storage architecture that scales to hundreds of terabytes of photometric data; and vi) robust machine-learning-based vetting of transient counterparts.

To support multiple reductions of our long-term dataset, the pipeline is optimized to the point that it can operate much faster than real time. All reduction is done locally at the telescope on 64-core analysis servers with 384 GiB of RAM. The pipeline delivers vetted, calibrated 60 MPix images with 0.1-pixel-precision astrometry at a median latency of 28 s (99.5% of images are delivered within 108 s).

The Evryscope data-analysis pipeline pioneered new technologies that will enable the Argus Array software, including an ultra-fast flat-binary file format for the rapid storage and retrieval of petabyte-scale precision photometry; precision flat-fielding for wide-field images subject to sky-brightness gradients; custom high-speed astrometric solvers based on the expected image distortions of each telescope; rapid image-quality metrics for autonomous rejection of bad images from coadditions; sky-partitioned light-curve and catalog storage; ConvNet-based machine-learning systems for rapid false-positive discrimination; and high-speed Evryscope image subtraction.
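The flat-binary photometry store itself is purpose-built, but the idea behind it can be sketched in a few lines: fixed-width records let any (source, epoch) range be located by offset arithmetic alone and memory-mapped without parsing. The record layout and field names below are hypothetical, not the Evryscope format:

```python
import numpy as np

# Hypothetical fixed-width record for one photometric epoch. A flat
# binary layout lets a reader seek directly to any epoch range by
# offset arithmetic alone: no parsing, no per-row overhead.
EPOCH_DTYPE = np.dtype([
    ("mjd",      np.float64),   # epoch midpoint (Modified Julian Date)
    ("flux",     np.float32),   # calibrated aperture flux
    ("flux_err", np.float32),   # flux uncertainty
    ("flags",    np.uint16),    # quality bits (clouds, edge, saturation)
])

def append_epochs(path, records):
    """Append epoch records for one source to its flat-binary file."""
    with open(path, "ab") as f:
        f.write(np.asarray(records, dtype=EPOCH_DTYPE).tobytes())

def read_lightcurve(path):
    """Memory-map an entire light curve with zero parsing or copying."""
    return np.memmap(path, dtype=EPOCH_DTYPE, mode="r")

# Usage sketch:
append_epochs("src_0001.bin", [(59000.0000, 1234.5, 3.2, 0),
                               (59000.0014, 1230.1, 3.1, 0)])
lc = read_lightcurve("src_0001.bin")
print(lc["mjd"], lc["flux"])
```

In a scheme like this, sky partitioning reduces to a directory layout over such files, so a light-curve query need only touch the partitions covering the requested coordinates.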
The Argus Array data-analysis systems are based on the lessons learned in implementing the Evryscope data-analysis system. Built on GPU computing and a robust data-transfer architecture, the Argus pipelines are capable of data analysis orders of magnitude faster than their Evryscope predecessors.
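As an illustration of why this workload suits GPUs, here is a minimal forced-aperture-photometry sketch written with CuPy (an assumed dependency; this is illustrative, not the Argus pipeline code):

```python
import cupy as cp    # assumes a CUDA GPU with CuPy installed
import numpy as np

def forced_aperture_photometry(image, x, y, r=4):
    """Sum counts in circular apertures at known (x, y) positions.
    Every source is handled in one batched array expression, the kind
    of data-parallel workload that maps naturally onto GPU threads.
    Illustrative only; not the Argus implementation."""
    img = cp.asarray(image, dtype=cp.float32)
    xs = cp.asarray(x, dtype=cp.float32)[:, None, None]   # (nsrc, 1, 1)
    ys = cp.asarray(y, dtype=cp.float32)[:, None, None]
    d = cp.arange(-r, r + 1)
    dy, dx = cp.meshgrid(d, d, indexing="ij")             # aperture footprint
    in_ap = (dx ** 2 + dy ** 2) <= r ** 2                 # circular mask
    # Integer pixel coordinates for every (source, offset) pair, clipped
    # so sources near the frame edge do not index out of bounds.
    xi = cp.clip(cp.rint(xs + dx).astype(cp.int32), 0, img.shape[1] - 1)
    yi = cp.clip(cp.rint(ys + dy).astype(cp.int32), 0, img.shape[0] - 1)
    return cp.asnumpy((img[yi, xi] * in_ap).sum(axis=(1, 2)))

# Usage sketch on a synthetic frame:
frame = np.random.default_rng(0).normal(100.0, 5.0, (2048, 2048))
print(forced_aperture_photometry(frame, x=[512.3, 1800.7], y=[900.1, 64.2]))
```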