Software and combined performance

The Sheffield ATLAS Group contributes to several aspects of the software and combined performance work in ATLAS. Our contributions have been essential in developing the algorithms and tools used for ATLAS physics analysis.


Our expertise:

  • Development and maintenance of the electron/photon reconstruction software
  • Development and operation of the general reconstruction software
  • Validation of the ATLAS software
  • Development of new particle-flow algorithms to reconstruct ATLAS events

Software development and computing work

The Sheffield group is at the centre of software development and maintenance for the ATLAS experiment. At present we contribute to reconstruction and release-deployment work:

  • Offline and prompt reconstruction coordinators (Hodgkinson, Donszelmann)
  • Offline release management and core software (Vickey-Boeriu)

In addition, over the years the Sheffield group has been in charge of:

  • The validation of the software and the assessment of its usability for physics analysis (software and physics validation)
  • The event data model and the analysis model
  • The overall coordination of the software for the ATLAS upgrade

Sheffield is also part of the NorthGrid Tier-2 and is a reliable provider of computing resources via GridPP.

Electron and photon reconstruction

The remarkable discovery of the Higgs boson in the experimentally harsh LHC environment would not have been possible without the deep understanding of the detector performance and optimisation of the reconstruction and identification algorithms.

An electron or photon is reconstructed from energy deposits in the calorimeter (clusters). For electrons, inner-detector tracks are matched to the cluster, providing information on the particle’s origin and direction, and therefore on its momentum.
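The matching step above can be sketched in a few lines. This is a minimal illustration, not the ATLAS implementation: it assumes clusters and tracks are simple dictionaries carrying (eta, phi) directions, and uses a hypothetical angular-distance cut `max_dr` as the matching criterion.

```python
import math


def delta_r(eta1, phi1, eta2, phi2):
    """Angular distance between two detector directions,
    with the phi difference wrapped into [-pi, pi)."""
    dphi = (phi1 - phi2 + math.pi) % (2 * math.pi) - math.pi
    return math.hypot(eta1 - eta2, dphi)


def match_track_to_cluster(cluster, tracks, max_dr=0.05):
    """Return the inner-detector track closest to the calorimeter
    cluster in (eta, phi), or None if no track falls within max_dr.
    In this toy picture a cluster with a matched track becomes an
    electron candidate; an unmatched cluster a photon candidate."""
    best, best_dr = None, max_dr
    for trk in tracks:
        dr = delta_r(cluster["eta"], cluster["phi"], trk["eta"], trk["phi"])
        if dr < best_dr:
            best, best_dr = trk, dr
    return best
```

In practice the real algorithm uses far richer matching criteria (track quality, bremsstrahlung-aware extrapolation to the calorimeter), but the cluster-seeded, track-matched structure is the same.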

Members of the Sheffield ATLAS group (Anastopoulos, Donszelmann) have played a leading role in the area of electron performance, and coordinated the development of the new electron reconstruction algorithm used by the ATLAS experiment in the Higgs discovery.

This improved electron reconstruction resulted in a 30% improvement in the expected sensitivity for the Higgs discovery in the H → ZZ(*) → 4l channel.

Currently, Christos Anastopoulos of the Sheffield ATLAS group is the convener of the electron and photon combined performance group of the ATLAS Collaboration.

Diagram showing the components of the electron reconstruction algorithm used by the ATLAS experiment

Particle flow in ATLAS

The Sheffield group has developed a particle-flow algorithm for use within ATLAS, which uses all detector information to optimise the measurement of hadronic objects such as jets and taus.

The package, called eflowRec, is currently being studied for use within jet and missing-ET reconstruction, and was run in the 2015 data (and mc15 Monte Carlo) reconstruction processing. An improved version of the algorithm will be run in the 2016 data processing. eflowRec is written in C++ within the Athena offline framework and uses a Python front end for run-time configuration.
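The core idea of particle flow, subtracting the expected calorimeter deposit of each charged track and keeping the remainder as neutral energy, can be sketched as below. This is a toy illustration under strong simplifying assumptions: track-to-cluster matching is taken as already done (a hypothetical `cluster_index` field), and the expected shower energy is modelled by a fixed average E/p, which the real algorithm replaces with detailed detector-response parametrisations.

```python
def particle_flow(tracks, clusters, e_over_p=0.7):
    """Toy charged-shower subtraction in the spirit of particle flow.

    Each track's expected calorimeter deposit (here a fixed <E/p>
    fraction of its momentum, an assumption for illustration) is
    removed from its matched cluster; the surviving cluster energy
    is interpreted as neutral energy. The charged momenta come from
    the tracker, which measures them more precisely at low energy.

    Returns (charged_momenta, neutral_energies).
    """
    remaining = [c["e"] for c in clusters]
    charged = []
    for trk in tracks:
        i = trk["cluster_index"]        # matching assumed already done
        expected = e_over_p * trk["p"]  # expected charged shower energy
        remaining[i] -= min(expected, remaining[i])
        charged.append(trk["p"])        # use the tracker measurement
    neutral = [e for e in remaining if e > 0.0]
    return charged, neutral
```

For example, a 10 GeV track matched to a 12 GeV cluster would contribute its 10 GeV track momentum to the charged list, while 12 − 0.7 × 10 = 5 GeV survives as a neutral deposit.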

A paper will be submitted to a journal in the coming months detailing the 2015 version of the algorithm, along with an ATLAS CONF note detailing the updates for 2016.

Mark Hodgkinson is currently co-coordinator of the ATLAS JetETMiss particle flow task force. Details of the particle flow algorithm can be found in this recent paper submitted to EPJC. An ATLAS physics briefing, which describes the main ideas behind this technique, has also recently been published.

Tau reconstruction in ATLAS

We have also worked closely with physicists in the ATLAS Bonn group in Germany to develop particle-flow-based techniques to reconstruct tau substructure. This substructure reconstruction can be used to measure the properties of the Higgs boson when it decays to a di-tau state.

A tau-specific implementation of eflowRec was developed as one of several candidate algorithms for the reconstruction of tau substructure; its output can be used as input to the PanTau software developed at Bonn to perform tau decay-mode classification.
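Once the particle-flow constituents of a tau candidate are in hand, decay-mode classification amounts to counting them. The sketch below uses the common "NpMn" shorthand (N charged prongs, M neutral pions); it is an illustrative toy, not the PanTau implementation, which additionally uses multivariate techniques to resolve ambiguous cases.

```python
def classify_tau_decay_mode(charged_hadrons, neutral_pions):
    """Toy tau decay-mode classification from particle-flow
    constituents: label the candidate by its number of charged
    prongs and reconstructed neutral pions, e.g. '1p1n' for
    tau -> pi nu with one pi0."""
    return f"{len(charged_hadrons)}p{len(neutral_pions)}n"
```

For instance, a candidate with one charged pion and one neutral pion would be labelled "1p1n", and a three-prong candidate with no neutrals "3p0n".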

However, since tau substructure is much less complex than that of jets, it turned out that a much simpler and more flexible algorithm, developed at Bonn, worked well.

Sheffield led the design of the data-storage software, which is used jointly by this algorithm and by the generalised particle-flow algorithm deployed for jets and missing ET. The tau case is special in that the well-understood tau kinematics can be used to guide the charged-shower subtraction procedure for a given particle-flow algorithm.

Details of the tau particle algorithm can be found in this recent paper published by EPJC.