HSF-OSG-WLCG Workshop at Jefferson Lab, HOW2019

The annual HEP Software Foundation workshop took place at Jefferson Laboratory (JLab) from 18 to 22 March. This year we had the opportunity to join forces again with the Worldwide LHC Computing Grid (WLCG) community and, in addition, the US Open Science Grid (OSG). Almost 250 scientists from the LHC, wider HEP, and non-HEP communities joined the meeting.

On the first day we opened the meeting with an excellent introduction to JLab from lab director Stuart Henderson and an overview of JLab computing and software from Raffaella De Vita. That was followed by plenary talks from the LHC experiments, other HEP experiments, and many non-HEP communities, including the next-generation US nuclear physics facility, the Electron-Ion Collider. Together these set the stage for the computing and software challenges we face in data-intensive science over the next decade.

JLab Workshop Group Photo
Workshop Participants. Photo © DOE Jefferson Laboratory

The theme of working more closely with other sciences was underlined by the Monday afternoon discussion on the Evolution of the WLCG Collaboration. Sharing an infrastructure for big-data sciences, building on what we already know and have, received wide support, but the details of how to manage this for all communities still need to be worked out.

Common sessions for HSF and WLCG on Tuesday looked at the evolution of technology, based on the impressive work done by the HEPiX Benchmarking WG. Processors, storage and networking are all changing, and HEP has to adapt to that while also making more and more use of HPC facilities. As HPC centres are increasingly equipped with compute accelerators, this led very naturally to the afternoon HSF session on Software for Accelerators. These devices are very different from the CPUs for which we have written most of our software up to now, and they pose serious challenges for developers. Integration with CPU frameworks and finding the best way to maintain code for a heterogeneous future were among the topics where the HSF will continue to work to identify prototypes and share best practice. ALICE showed how they are using GPUs to achieve the required throughput in Run 3. LHCb, which also faces the stiff test of increased throughput in Run 3, is actively pursuing R&D on GPUs and presented encouraging results.

JLab Workshop Plenary Discussion Photo
Plenary Discussions. Photo © DOE Jefferson Laboratory

On Wednesday and Thursday it was the turn of the HSF Working Groups to organise sessions. Our three new working groups were the stars of the show, and the quality of the sessions they organised was a testament to how much good work and preparation has been done since the start of the year.

Detector Simulation looked at everything from physics improvements for the future to the speed boosts we need and how to get them. The GeantV vectorisation R&D presented important results, and approximate methods for fast simulation were discussed, including progress in using machine learning.

Data Analysis presented a summary of what was learned from their topical workshops, along with new approaches for the future. Declarative analysis, where physicists state what they want to compute rather than how it should be computed, is now being explored in many R&D projects; given the uncertainty about future computing architectures, decoupling the analysis description from its execution is a topic well worth investigating.
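To give a flavour of the declarative style, here is a minimal sketch using ROOT's RDataFrame, one widely known example of this approach; the file, tree, and branch names are purely illustrative, not from the talks.

```python
# A minimal sketch of the declarative style using ROOT's RDataFrame; the
# file, tree, and branch names here are hypothetical, for illustration only.
import ROOT

# Declare *what* to compute: a selection, a derived quantity, a histogram.
df = ROOT.RDataFrame("events", "data.root")
h = (df.Filter("n_muons == 2", "exactly two muons")
       .Define("pt_sum", "mu1_pt + mu2_pt")
       .Histo1D(("pt_sum", "Scalar p_{T} sum;p_{T} [GeV];Events", 100, 0.0, 500.0),
                "pt_sum"))

# The event loop only runs when a result is actually needed; *how* it runs
# (single-threaded, implicitly multi-threaded, ...) is left to the framework.
print("mean pT sum:", h.GetMean())
```

Because the code states only the intent, the framework remains free to vectorise, parallelise, or offload the event loop as architectures evolve.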

Reconstruction and Software Triggers looked at the increasing tendency to produce analysis-quality output close to the detector, both in time and in space, so-called Real Time Analysis. This touched again on integrating compute accelerators, such as FPGAs, as a way to perform complex inference within budget.

JLab Real Time Analysis Talks: ATLAS and CMS
RTA Presentations. Photos © Caterina Doglioni

Many of our other HSF working groups also organised sessions. Education and Training remains a major challenge for the community. A survey of HEP's training needs provides valuable input for how we organise schools and training in the future. The LHCb StarterKit programme continues to shine as an example of bottom-up training that is an inspiration for many other experiments.

The PyHEP group organised a session that explored our links with the wider Python community, with an emphasis on toolset approaches where different tools mesh together to form the required pipeline. There was also a presentation from outside HEP, with Jonathan Helmus from Anaconda introducing the numba Python JIT and the Conda package distribution. To the latter our own community has contributed ROOT packages for Linux and macOS, which are already very popular.
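As a taste of what the numba JIT offers, here is a minimal sketch of our own (not from the presentation): decorating a plain Python loop with @njit compiles it to native code on first call, so per-event loops no longer pay the interpreter overhead.

```python
# A minimal, self-contained illustration of the numba JIT; the physics
# function and inputs are our own example, not from the talk.
import numpy as np
from numba import njit

@njit
def invariant_mass(pt1, eta1, phi1, pt2, eta2, phi2):
    # Two-body invariant mass in the massless approximation,
    # m^2 = 2 pT1 pT2 (cosh(deta) - cos(dphi)), element by element.
    out = np.empty_like(pt1)
    for i in range(pt1.size):
        out[i] = np.sqrt(2.0 * pt1[i] * pt2[i]
                         * (np.cosh(eta1[i] - eta2[i]) - np.cos(phi1[i] - phi2[i])))
    return out

# The first call triggers compilation; later calls run at native speed.
rng = np.random.default_rng(42)
args = [rng.random(1_000_000) for _ in range(6)]
print(invariant_mass(*args)[:5])
```

For completeness, the community-contributed ROOT packages mentioned above are available through the conda-forge channel (conda install -c conda-forge root).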

The theme of packaging was touched on again in the Software Development Tools session. The HSF Packaging WG presented solutions that support the wider science community and look like a good bet for the future. Closer to the code-face, presentations on profiling and static analysis provided developers with good advice about the best tools to use.

On Friday the sessions returned to plenary mode and we heard from projects being funded to provide the investment in software and computing that we so badly need. It was therefore very fitting that the HSF Community White Paper Roadmap was finally published in Computing and Software for Big Science during the week of the workshop.

That led very neatly to a closing talk from JLab’s Amber Boehnlein on her thoughts about the future of computing in the field. Amber was the main local organiser of the workshop, and we were very happy to warmly thank her and the rest of the team for a job well done. The dinner at the local Mariners’ Museum was greatly appreciated and offered a wonderful backdrop for continued discussions. We all enjoyed the early spring meeting at JLab and already look forward to next year’s event.

JLab Workshop Dinner
Dinner in the Mariners' Museum. Photo © DOE Jefferson Laboratory