Visualising HEP event data is currently typically done per experiment (e.g. VP1, Iguana, Fireworks) and normally involves installing dedicated software. However, modern browsers are more than capable of showing complex detector geometry, as well as representations of the underlying physics. As the Visualisation section of the HSF Community White Paper explained, using an intermediate data format (e.g. JSON) makes it possible to separate the event display from the underlying (experiment-specific) software framework.
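As an illustration, such an intermediate-format event could look like the fragment below. This is only a sketch of the idea (grouping objects by type and then by collection, with geometry as plain coordinate arrays), not the exact schema exported by any particular experiment:

```json
{
  "event number": 145,
  "run number": 296942,
  "Tracks": {
    "ExampleTrackCollection": [
      { "pos": [[0.0, 0.0, 0.0], [10.2, 4.1, 12.9]], "color": "0xff0000" }
    ]
  }
}
```

Because the format is experiment-agnostic, any framework that can emit it can drive the same browser-based display.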
Phoenix is a framework that can be used by any typical (e.g. colliding beam) High Energy Physics experiment. It was initially based on work done for the TrackML Kaggle/Codalab challenges (and internal use by ATLAS).
At the recent Advanced Computing and Analysis Techniques (ACAT) 2021 conference, two submissions cited Phoenix.
GSoC students in 2019 and 2020 ported Phoenix to Angular, designed and implemented a new menu system, and much more. Most base functionality now exists for the core display; the task now is to polish this and to further develop the VR display.
We have three possible tasks this year. All are expected to take 175 hours (i.e. are short projects).
Phoenix currently supports VR and AR in a quite basic form. Even with the limited functionality we have right now, this is a very exciting way to use the event display, and it is something we would really like to improve.
However, one current omission is that there is no way to interact with the detector and event data visualisation: at the moment in AR/VR the user has no way to change what is shown.
Phoenix’s menu system was always intended to be extensible, and so it should be possible to add a 3D menu when in VR mode. The major complication here is that visualising detectors and event data on a phone is a very different experience from using a dedicated headset (such as a Meta Quest), so much care will need to be taken to make the UI as intuitive as possible. Once this is made to work, we can investigate adding a simple menu for AR mode (AR is more difficult, since it is less stable).
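One common way to drive such a 3D menu is to cast a ray from the VR controller onto a flat menu panel and map the intersection point to a button. In Phoenix the panel would be a three.js mesh and the intersection would come from a `THREE.Raycaster`; the sketch below shows only the final 2D hit-test step, with all names being illustrative rather than any existing Phoenix API:

```typescript
// Maps a controller ray's intersection point on a flat menu panel to a menu
// button. The (px, py) coordinates are assumed to already be in the panel's
// local 2D space (the ray-plane intersection itself would be done by three.js).

interface MenuButton {
  label: string;          // e.g. 'Show/hide geometry'
  x: number; y: number;   // bottom-left corner in panel coordinates
  width: number;
  height: number;
}

function buttonAt(
  buttons: MenuButton[],
  px: number,
  py: number,
): MenuButton | undefined {
  return buttons.find(
    (b) =>
      px >= b.x && px <= b.x + b.width &&
      py >= b.y && py <= b.y + b.height,
  );
}

// A hypothetical two-button panel occupying a 1x1 square.
const panel: MenuButton[] = [
  { label: 'Show/hide geometry', x: 0, y: 0.6, width: 1, height: 0.4 },
  { label: 'Show/hide tracks', x: 0, y: 0.0, width: 1, height: 0.4 },
];

buttonAt(panel, 0.5, 0.8)?.label; // → 'Show/hide geometry'
```

The same hit-test works unchanged for AR, which is one reason to keep the menu logic separate from the rendering layer.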
Key4HEP is a new software stack being developed for future particle physics experiments. An important part of it is EDM4HEP, “a generic event data model (EDM) for future High Energy Physics (HEP) collider experiments”. In practice, this means it is a way of representing event data for potential future experiments, and we would like to be able to read this data in Phoenix.
Phoenix is designed to be extensible via “loaders”: essentially small interpreters that convert arbitrary data formats into Phoenix’s native event data model. The concrete task would therefore be to write a loader which understands EDM4HEP.
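The core of such a loader is a conversion step from EDM4HEP collections to Phoenix’s event data model. The sketch below uses a simplified stand-in for EDM4HEP input (the real EDM is defined in PODIO YAML and usually serialised via ROOT) and an output shape that mimics Phoenix’s JSON-like model; all type and field names here are illustrative assumptions, not the final API:

```typescript
// Simplified stand-in for an EDM4HEP event: tracker hits grouped by collection.
interface Edm4hepTrackerHit {
  cellID: number;
  position: { x: number; y: number; z: number };
}

interface Edm4hepEvent {
  eventNumber: number;
  runNumber: number;
  trackerHits: { [collection: string]: Edm4hepTrackerHit[] };
}

// Phoenix-style event data: objects grouped by type, then by collection name,
// with hit geometry carried as arrays of [x, y, z] points.
interface PhoenixEventData {
  'event number': number;
  'run number': number;
  Hits: { [collection: string]: { pos: number[][] }[] };
}

function edm4hepToPhoenix(event: Edm4hepEvent): PhoenixEventData {
  const hits: PhoenixEventData['Hits'] = {};
  for (const [name, collection] of Object.entries(event.trackerHits)) {
    hits[name] = [
      { pos: collection.map((h) => [h.position.x, h.position.y, h.position.z]) },
    ];
  }
  return {
    'event number': event.eventNumber,
    'run number': event.runNumber,
    Hits: hits,
  };
}
```

In a real loader this conversion would live in a class plugged into phoenix-event-display, alongside similar mappings for tracks, calorimeter clusters and jets.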
Phoenix consists of two packages: phoenix-event-display, the core event display library, and phoenix-ui-components, which provides the UI for modifying the scene. The current tests for both packages are written using Jasmine and Karma. While Jasmine and Karma are fairly popular and feature-packed, the browser-oriented nature of Karma makes the tests very slow, especially for phoenix-event-display, which doesn’t necessarily need a browser.
In the case of phoenix-ui-components, the tests use the Angular testing setup, which is based on Jasmine and Karma. While the setup is correct and the tests run fine, they need a serious revamp. The idea would be to update the current test setup and make sure it conforms to the latest Angular version, and (most of) the tests would need to be rewritten to follow Behaviour Driven Development and be more meaningful.
Integration tests also need to be written for phoenix-app, which is the actual Phoenix application and uses phoenix-ui-components.
Phoenix does not have any end-to-end tests, so these also need to be set up and written, covering both phoenix-event-display and phoenix-ui-components. Possible options are Cypress and Protractor. Skills: Angular, TypeScript, web development (GUI design experience and three.js knowledge a bonus).