ATLAS is a particle physics experiment at the Large Hadron Collider at CERN that is searching for new discoveries in the head-on collisions of protons of extraordinarily high energy. ATLAS will learn about the basic forces that have shaped our Universe since the beginning of time and that will determine its fate. Among the possible unknowns are extra dimensions of space, unification of fundamental forces, and evidence for dark matter candidates in the Universe. Following the discovery of the Higgs boson, further data will allow in-depth investigation of the boson's properties and thereby of the origin of mass.
Late 2009 -- Startup of the LHC and first collisions at a total energy of 0.9 TeV, and later at 2.36 TeV (above the previous world record).
March 2010 -- Collisions at a total energy of 7 TeV. This led to about eight months of data taking before a few weeks of heavy ion collisions and the usual winter shutdown. Many papers with early results came out of the 2010 run.
March 2011 -- Collisions at a total energy of 7 TeV. A year of intensive data taking was followed by a few weeks of heavy ion collisions and a winter shutdown (Dec. 2011 - Apr. 2012).
April 2012 -- Collisions at a total energy of 8 TeV. A year of intensive data taking will be followed by a few weeks of heavy ion collisions.
2013 -- A long shutdown to prepare for an increase of the total energy towards 14 TeV.
2015-2030 -- Continued data taking, with publication of results on an ongoing basis.
Non-reproducible data exist in two or more geographically disparate copies across the WLCG. The site bit-preservation commitments are defined in the WLCG Memorandum of Understanding. All data are to be reprocessed with the most recent software to ensure longevity.
Non-reproducible: RAW physics data, calibrations, metadata, documentation and transformations (jobs).
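The two-copy requirement can be pictured as a simple consistency check over a replica catalogue. This is a minimal sketch only: the dataset names, site names, and site-to-region mapping below are invented for illustration, and real ATLAS replica management is handled by dedicated grid data-management tools.

```python
# Check that every non-reproducible dataset has replicas at two or more
# geographically disparate WLCG sites. Illustrative sketch only: site
# names, regions, and dataset names are hypothetical.

SITE_REGION = {
    "CERN-PROD": "CH",
    "BNL-ATLAS": "US",
    "RAL-LCG2": "UK",
}

def well_replicated(replica_sites, site_region=SITE_REGION):
    """True if the replicas span at least two distinct regions."""
    regions = {site_region[site] for site in replica_sites}
    return len(regions) >= 2

# Toy catalogue: dataset -> list of sites holding a copy.
catalogue = {
    "data12_8TeV.RAW.run001": ["CERN-PROD", "BNL-ATLAS"],
    "data12_8TeV.RAW.run002": ["CERN-PROD"],  # only one region: at risk
}

at_risk = [ds for ds, sites in catalogue.items() if not well_replicated(sites)]
```

A check of this shape flags datasets whose loss at a single site (or region) would be unrecoverable, which is exactly the failure mode the bit-preservation commitments guard against.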
Derived data: formats for physics analysis within the collaboration, and formats distributed for education and outreach. Production was greatly improved by the common derived-data production framework in Run 2. Results are published in journals and HEPData, sometimes with the analysis published in Rivet and RECAST. Format lifetimes are hard to predict, but on current experience are 5-10 years, and changes are likely to coincide with the gaps between major running periods.
The software provenance of derived data is stored in the AMI database. Numerous TWikis are available describing central and analysis-level software. Interfaces such as AMI and COMA contain the metadata.
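Conceptually, a provenance record links each derived dataset back to its input and to the software release and job transform that produced it, so that the derivation can be repeated. The sketch below illustrates that idea only; the field names and dataset names are hypothetical, not AMI's actual schema.

```python
# Toy provenance record: the minimum information needed to regenerate a
# derived dataset from its parent with the same software. All names and
# field keys here are illustrative, not the real AMI schema.

provenance = {
    "dataset": "data12_8TeV.physics_Muons.DAOD",
    "parent": "data12_8TeV.physics_Muons.RAW",
    "release": "Athena 17.2.13",       # software release used
    "transform": "Reco_trf.py",        # job transformation
    "config_tag": "f472_m1213",        # calibration/configuration tag
}

records = {provenance["dataset"]: provenance}

def lineage(dataset, records):
    """Walk parent links back to the non-reproducible RAW data."""
    chain = [dataset]
    while dataset in records:
        dataset = records[dataset]["parent"]
        chain.append(dataset)
    return chain
```

Walking the parent links from any derived format terminates at the RAW data, which together with the recorded release and transform is what makes the derived data reproducible rather than requiring independent preservation.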
The publications themselves are produced via the physics result approval procedures set out in ATL-GEN-INT-2015-001 held in CDS; this sets out in detail the expected documentation within papers and the supporting documentation required.
Compiled libraries and executables of the “Athena” framework are published on CVMFS. Software versioning is maintained on the CERN Subversion server.
Main usage of data: future analysis within the collaboration
Further usage: review in collaboration and potential for outreach
Re-use of data (new analyses) within the collaboration, open access sharing of curated data
Publications by the collaboration. Training of PhD students.
Unique data sets (both pp and heavy ion) are being acquired between now and 2035. Similar data are acquired only by the other LHC experiments.
The active collaboration shares the operational costs with the WLCG computing centres.
ATLAS replicates the non-reproducible data across the WLCG and maintains a database of software provenance with which to reproduce the derived data. There are plans to bring Run 1 data to Run 2 status. Masterclass exercises are available on the CERN Open Data Portal, and expansion is being considered. Some analyses are published in Rivet/RECAST.
Person-power within the experiment is hard to find. Validation of future software releases against former processings is crucial. There are no current plans beyond the lifetime of the experiment.
On-going development of RECAST with Rivet, together with collaboration with CERN IT and the other LHC experiments via the CERN Analysis Portal, is pursued as a solution to the problem of analysis preservation.