


CAVE Visualization of the IceCube Neutrino Detector

Ross Tredinnick 1,2∗  James Vanderheiden 1  Clayton Suplinski 1  James Madsen 3,4†

1 University of Wisconsin - Madison  2 Wisconsin Institute for Discovery  3 University of Wisconsin - River Falls  4 WIPAC

∗e-mail: [email protected]  †e-mail: [email protected]

ABSTRACT

Neutrinos are nearly massless, weakly interacting particles that come from a variety of sources including the sun, radioactive decay, and cosmic rays. Neutrinos are unique cosmic messengers that provide new ways to explore the Universe as well as opportunities to better understand the basic building blocks of matter. IceCube, the largest operating neutrino detector in the world, is located in the ice sheet at the South Pole. This paper describes an interactive VR application for visualization of IceCube's neutrino data within a C6 CAVE system. The dynamic display of data in a true-scale recreation of the light sensor system allows events to be viewed from arbitrary locations both forward and backward in time. Initial feedback from user experiences within the system has been positive, showing promise both for further insight into analyzing data and for physics and neutrino education.

Index Terms: H.5.1 [Information Presentation]: Multimedia Information Systems—Artificial, augmented, and virtual realities;

1 INTRODUCTION AND BACKGROUND

IceCube, a recently constructed neutrino detection system, consists of 5,160 optical sensors, called Digital Optical Modules (DOMs), located between 1,450 and 2,450 meters deep in the ice and spread across a cubic kilometer of space. When neutrinos interact with atoms in the ice, Cherenkov radiation produces faint flashes of blue light whose photons are recorded by a photomultiplier tube within each DOM [7]. The IceCube team defines a series of these light detections across several DOMs as an "event". A CAVE application for visualizing events was attractive to the IceCube team because their data could be analyzed in a true-scale environment, giving a perspective unattainable with traditional desktop tools [3]. Furthermore, a CAVE application would allow more people to experience the detection system. The goal of the application is therefore to visualize event data inside a true-scale simulation of the detection environment, both under and above the ice, in a CAVE. In recent years, researchers simulated the Super-Kamiokande (Super-K) neutrino detector in a CAVE environment [6]. The IceCube environment differs in that it is about 25 times the size of the Super-K detector, resulting in a sparser layout of individual detection modules.

2 APPLICATION

The application runs in a C6 CAVE at a resolution of 1920x1920 pixels per wall, with InterSense IS-900 head tracking and a tracked MicroTrax wand 3D input device for interaction and navigation. The system is equipped with a 5.1 stereo surround sound system. The application runs on the Syzygy VR framework and was developed using C++, OpenGL, and GLSL [9].

At the surface of the ice, the application renders 162 IceTop tanks that hold two DOMs each. This surface array allows researchers to detect showers of secondary particles generated by cosmic ray interactions with the atmosphere. The application also renders a model of the IceCube Laboratory (ICL), plays a wind sound effect, and renders a snowy terrain. Underneath the ice, DOMs are held on cables called "strings", with each string holding 60 DOMs at 17-meter vertical separation and 125-150 meters of horizontal separation between strings. The IceCube team provided a highly detailed 3D model of a DOM that has over 600,000 vertices and over 1.2 million polygons. To render this model 5,160 times, the application uses a level-of-detail (LOD) approximation [4]. The team developed four simplified versions of the original model. Each DOM model is loaded once into memory, and the two highest-polygon versions are drawn with GLSL Phong shading. To further improve rendering performance, the system frustum-culls the instanced DOM models.
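Although the paper does not include source code, the rendering strategy above can be summarized in a short sketch. The C++ sketch below shows a distance-based LOD pick combined with frustum culling over the instanced DOMs; the distance thresholds, type names, and the stub frustum test are illustrative assumptions, not the application's actual code.

    // Illustrative sketch only: per-DOM LOD selection plus frustum culling.
    // The paper states that four simplified models accompany the original;
    // the distance thresholds and helper names here are assumptions.
    #include <array>
    #include <cmath>
    #include <vector>

    struct Vec3 { float x, y, z; };

    struct DomInstance {
        Vec3 position;  // fixed location of this DOM in the detector
        int  lodLevel;  // 0 = original full-detail model, 4 = coarsest
    };

    // Assumed distances (meters) separating the five detail levels.
    constexpr std::array<float, 4> kLodDistances = {30.f, 80.f, 200.f, 500.f};

    int selectLod(const Vec3& eye, const Vec3& dom) {
        float dx = dom.x - eye.x, dy = dom.y - eye.y, dz = dom.z - eye.z;
        float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
        for (int level = 0; level < 4; ++level)
            if (dist < kLodDistances[level]) return level;
        return 4;  // farthest DOMs use the coarsest model
    }

    // Stand-in: a real implementation tests a bounding sphere against
    // the six view-frustum planes.
    bool inFrustum(const Vec3& /*p*/) { return true; }

    void cullAndAssignLods(const Vec3& eye, std::vector<DomInstance>& doms,
                           std::vector<const DomInstance*>& visible) {
        visible.clear();
        for (auto& dom : doms) {
            if (!inFrustum(dom.position)) continue;  // skip off-screen DOMs
            dom.lodLevel = selectLod(eye, dom.position);
            visible.push_back(&dom);
        }
    }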

The application's 3D user interface (UI) combines a floating, adapted 2D menu interface with 3D selection icons and a timeline playback interface [2]. At start-up, the 3D UI draws four 3D icons horizontally arranged in front of the tracked wand. Pressing one of four buttons across the top of the wand selects the corresponding 3D icon, causing one of four display panels to appear two feet from the tip of the wand. The navigation and event selection panel provides buttons for quick movement to preset locations and allows selection of a current event file. The information panel updates continuously and shows event playback statistics including position, charge, and time, plus information about the five DOMs closest to the user, including string number, module number, and depth. The options panel provides toggles for event sphere fade-out, a post-processing glow feature, drawing of strings, and sound playback, and also contains a slider for movement speed. The help panel provides a graphic of the wand and icon layout with a key explaining what the different 3D icons represent. The orientation of the panels matches the yaw of the wand; pitch and roll are not considered. The active panel can be closed by pressing the same button used to open it, which returns the UI to drawing the 3D icons. The menu is head-referenced so that it follows the user during navigation, and due to the short distance between wand and menu, the application uses ray-casting for easy interaction with menu panels [1].
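The yaw-only panel constraint admits a minimal sketch, assuming a y-up world and illustrative names: projecting the wand's forward vector onto the horizontal plane discards pitch, and building the panel frame from the world up vector discards roll.

    #include <cmath>

    struct Vec3 { float x, y, z; };

    // Minimal sketch, assuming a y-up world: compute the panel's yaw
    // angle (radians about the vertical axis) from the wand's forward
    // direction. Dropping the vertical component means pitch cannot
    // tilt the panel; deriving the frame from world-up removes roll.
    float panelYawFromWand(const Vec3& wandForward) {
        return std::atan2(wandForward.x, wandForward.z);
    }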

An additional feature of the 3D UI is the ability to play events forward or backward in time. Pressing the joystick button switches the 3D panel icons to a familiar playback representation, with icons allowing the user to play an event forward or backward, or to fast-forward or rewind it. When an event plays forward or backward, a tick mark moves along a playback timeline, and the associated play icon switches to a pause icon. If playback is paused, the fast-forward and rewind icons switch to "step forward" and "step back" icons for moving playback in one-millisecond increments. Also when paused, holding the trigger button of the wand activates a "scrubbing" interface, which increases the length of the playback timeline to five feet and expands it forward two feet in 3D space. Rotating the wand on its vertical axis then adjusts playback time, updating the event display accordingly. Figure 1 shows the playback timeline in its non-expanded state.
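The scrubbing behavior can be summarized with a small sketch. The paper specifies only that wand rotation about the vertical axis adjusts playback time, so the gain constant, clamping, and function names below are assumptions.

    #include <algorithm>

    // Assumed gain: nanoseconds of event time per radian of wand yaw.
    constexpr double kScrubGainNs = 50000.0;

    // Called each frame while the trigger is held. The reference yaw and
    // time are captured once, when the trigger is first pressed.
    double scrubPlaybackTime(double timeAtTriggerPressNs, float wandYaw,
                             float yawAtTriggerPress,
                             double eventStartNs, double eventEndNs) {
        double delta =
            static_cast<double>(wandYaw - yawAtTriggerPress) * kScrubGainNs;
        return std::clamp(timeAtTriggerPressNs + delta,
                          eventStartNs, eventEndNs);
    }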

The visualization portrays an event as a sequence of colored and scaled spheres within the environment. A single detection within an event is represented by a five-tuple: x, y, and z position in meters, which coincide with a DOM location; a charge value representing the light detection; and a time in nanoseconds. Sphere diameter is mapped on a logarithmic scale to the charge value.
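A sketch of the five-tuple record and an assumed logarithmic mapping follows; the paper states only that diameter scales logarithmically with charge, so the constants and charge units are illustrative.

    #include <cmath>

    // One detection within an event, as described above.
    struct Detection {
        float  x, y, z;   // DOM position, meters
        float  charge;    // detected light (units not given in the paper)
        double timeNs;    // detection time, nanoseconds
    };

    // Assumed logarithmic charge-to-diameter mapping; kBase and kScale
    // are illustrative constants, not values from the application.
    float sphereDiameter(float charge) {
        constexpr float kBase  = 2.0f;  // meters at negligible charge
        constexpr float kScale = 3.0f;  // meters per decade of charge
        return kBase + kScale * std::log10(1.0f + charge);
    }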


Figure 1: Example of neutrino detection event visualization within the CAVE. The non-expanded timeline playback widget is shown.

The spheres expand over a short time interval when playing forward and likewise shrink, and optionally fade out, when playing an event backward. The visualization maps the time of neutrino detection to a spectrum-approximation color scale, which enhances understanding of the direction in which a neutrino event passes through the system, since the user can examine the resulting color pattern after a sequence has finished playing [10]. An example of this visualization is shown in Figure 1. To help further visualize movement direction, a singular value decomposition of the 3D positions within each event file yields a vector representing the best fit of the event positions. The system uses this vector to draw a red 3D line that expands or retracts through the detection system during playback.
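The best-fit computation can be sketched directly from the description above: center the event's detection positions and take the first right singular vector of the resulting N x 3 matrix. Eigen is used here for the SVD purely as an assumption; the paper does not name a linear algebra library.

    #include <Eigen/Dense>
    #include <vector>

    // Direction of the best-fit line through an event's detection
    // positions: the top right singular vector of the centered N x 3
    // position matrix.
    Eigen::Vector3d bestFitDirection(const std::vector<Eigen::Vector3d>& pts) {
        Eigen::MatrixXd m(pts.size(), 3);
        for (size_t i = 0; i < pts.size(); ++i)
            m.row(i) = pts[i].transpose();

        // Subtract the centroid so the line passes through the mean.
        Eigen::RowVector3d centroid = m.colwise().mean();
        m.rowwise() -= centroid;

        // Singular vectors are ordered by decreasing singular value, so
        // column 0 of V is the direction of greatest spread.
        Eigen::JacobiSVD<Eigen::MatrixXd> svd(m, Eigen::ComputeThinV);
        return svd.matrixV().col(0);
    }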

By default the user navigates the space via the tracked wand using a "pointing" technique, at a speed of 50 feet per frame [8]. The pointing technique allows users within the C6 CAVE to view event playback around them while navigating, something that gaze-based navigation would not allow. A second navigation technique allows movement to predetermined locations, such as the nearest DOM or the ICL above the ice, via buttons on the 3D UI. Upon selecting a predetermined location, the application smoothly interpolates the user's current position to the new location. A third navigation technique allows the user to "tour" along a predetermined path of interest [5]. Currently this technique is used to navigate around the inside and outside of the ICL. Each stop on a tour optionally triggers a rendering action that acts as an informational aid, such as drawing arrows pointing at IceTop tank locations. A final navigation technique follows the best-fit path of an event during playback. This "neutrino view" mode updates the user's position each frame along the best-fit line, giving the user perspective on the direction of neutrino travel through the array of DOMs.
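A minimal sketch of the "neutrino view" update is given below, assuming the best-fit line is parameterized linearly by normalized playback time; the paper does not give the exact parameterization, so the line length and mapping are assumptions.

    struct Vec3 { float x, y, z; };

    // Called once per frame during playback. lineOrigin and lineDir
    // describe the event's best-fit line; lineLength and the linear
    // time parameterization are assumptions for illustration.
    Vec3 neutrinoViewPosition(const Vec3& lineOrigin, const Vec3& lineDir,
                              double playbackNs, double eventStartNs,
                              double eventEndNs, float lineLength) {
        double t = (playbackNs - eventStartNs) / (eventEndNs - eventStartNs);
        float  s = static_cast<float>(t) * lineLength;  // distance along line
        return { lineOrigin.x + lineDir.x * s,
                 lineOrigin.y + lineDir.y * s,
                 lineOrigin.z + lineDir.z * s };
    }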

The IceCube application uses the FMOD sound library to play 3D positional audio cues at the location of events during playback to enhance the user experience within the CAVE. An event's charge value is mapped to the amplitude of the sound effect playback. The 3D sound playback allows the physics researchers to more quickly identify the active location of an event within the large environment, especially when first starting to play a new event.
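A hedged sketch of the charge-to-amplitude mapping follows. It assumes the FMOD core (low-level) API of recent releases; the paper does not state which FMOD version or calls were used, so the call pattern and normalization are assumptions.

    #include <fmod.hpp>
    #include <algorithm>

    // Play one positional cue at a DOM location, with volume derived
    // from the detection's charge. Assumes `system` is initialized and
    // `blip` was created with the FMOD_3D flag.
    void playDetectionCue(FMOD::System* system, FMOD::Sound* blip,
                          const FMOD_VECTOR& domPos, float charge,
                          float maxCharge) {
        FMOD::Channel* channel = nullptr;
        system->playSound(blip, nullptr, /*paused=*/true, &channel);

        FMOD_VECTOR velocity = {0.0f, 0.0f, 0.0f};
        channel->set3DAttributes(&domPos, &velocity);

        // Assumed mapping: charge normalized against the event maximum.
        channel->setVolume(std::min(1.0f, charge / maxCharge));
        channel->setPaused(false);
    }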

3 FUTURE WORK

Initial verbal feedback on the application has been very positive. User comments have suggested a better understanding of the scale of the detection system as well as of the direction of events. Researchers also made several positive comments about the intuitiveness of the 3D UI. A formal user study to evaluate the CAVE visualization remains future work. The development team plans to quantitatively evaluate the usefulness of the visualization by comparing it to a desktop tool that the research team also uses to visualize data. Items for comparison will include the ability to identify the direction of neutrino events, the time to find specific charge values at different DOM locations, and the time to identify neutrino pattern types. The team may also compare the CAVE implementation to a dome or HMD implementation.

Besides a user study, the next directions for this project fall into two main areas: developing more visualization techniques for additional data, and further enhancing the realism of the environment for public relations purposes. For the latter, more 3D models of the IceCube buildings that exist in Antarctica could be added to the simulation. The application could also visualize different ice layers while telling the story of how each layer affects neutrino detection. Improving the visualization could begin by working with a greater variety of the IceCube team's research data. Other visualization approaches for the data may prove useful, such as representing events as directional glyphs or adapting different color scales for representing time. An ultimate goal is to allow real-time viewing of all detected data from Antarctica inside the CAVE.

4 CONCLUSION

A CAVE application to visualize neutrino events from the IceCube neutrino detection system has been developed to help physics researchers better understand the hundreds of gigabytes of data gathered by the detector each day. The application uses a custom 3D UI, sonification, and a variety of navigation and visualization techniques to enhance the user experience across the large, cubic-kilometer environment.

REFERENCES

[1] D. A. Bowman and L. F. Hodges. An evaluation of techniques for grabbing and manipulating remote objects in immersive virtual environments. In Proceedings of the 1997 Symposium on Interactive 3D Graphics, pages 35–ff. ACM, 1997.

[2] D. A. Bowman, E. Kruijff, J. J. LaViola Jr., and I. Poupyrev. 3D User Interfaces: Theory and Practice. Addison-Wesley, 2004.

[3] C. Cruz-Neira, D. J. Sandin, and T. A. DeFanti. Surround-screen projection-based virtual reality: the design and implementation of the CAVE. In Proceedings of the 20th Annual Conference on Computer Graphics and Interactive Techniques, SIGGRAPH '93, pages 135–142, New York, NY, USA, 1993. ACM.

[4] H. Hoppe. Progressive meshes. In Proceedings of the 23rd Annual Conference on Computer Graphics and Interactive Techniques, SIGGRAPH '96, pages 99–108, New York, NY, USA, 1996. ACM.

[5] J. Ibanez, R. Aylett, and R. Ruiz-Rodarte. Storytelling in virtual environments from a virtual guide perspective. Virtual Reality, 7(1):30–42, 2003.

[6] B. Izatt, K. Scholberg, and R. P. McMahan. Super-KAVE: An immersive visualization tool for neutrino physics. Poster presented at the IEEE Virtual Reality Conference, 2013.

[7] S. R. Klein. IceCube: A cubic kilometer radiation detector. IEEE Transactions on Nuclear Science, 56(3):1141–1147, 2009.

[8] M. Mine et al. Virtual environment interaction techniques. UNC Chapel Hill Computer Science Technical Report TR95-018, 1995.

[9] B. Schaeffer and C. Goudeseune. Syzygy: native PC cluster VR. In Proceedings of IEEE Virtual Reality 2003, pages 15–22, 2003.

[10] C. Ware. Information Visualization: Perception for Design. Elsevier, 2012.
