
Jung Who Nam

Ph.D. Graduate - Computer Science

Download Resume

jungwhonam@gmail.com

About Me

For over 8 years, I have been designing and developing 3D user interaction techniques to assist experts in various domains with exploring and presenting their data.

I began researching virtual reality and interactive data visualization at IVLab, where I completed my Ph.D. focusing on making 3D user interfaces more accessible to scientists and the public. From 2019 to 2022, I built interactive installations for museums in South Korea to provide more engaging ways to explore historical data. I am currently a postdoctoral fellow at the Texas Advanced Computing Center, where I work on extending Intel's ray-tracing application, OSPRay, to support immersive virtual reality experiences.

Below, you can take a look at several of the projects I've developed throughout my research career.

Education

University of Minnesota, Twin Cities

2014 – 2022

Ph.D. in Computer Science

University of Minnesota, Twin Cities

2012 – 2014

M.S. in Computer Science

University of Minnesota, Twin Cities

2008 – 2012

B.S. in Computer Science

Publications


  • PEARC 2023

Immersive OSPRay: Enabling VR Experiences with OSPRay

Jung Who Nam, Gregory D. Abram, Francesca Samsel, and Paul A. Navrátil

ACM PEARC 2023, Portland, USA. (DOI: 10.1145/3569951.3597579)


  • TVCG

V-Mail: 3D-Enabled Correspondence about Spatial Data on (Almost) All Your Devices

Jung Who Nam, Tobias Isenberg, and Daniel F. Keefe

IEEE Transactions on Visualization and Computer Graphics, 2022. (DOI: 10.1109/TVCG.2022.3229017)


  • Book chapter

Hybrid Data Constructs: Interacting with Biomedical Data in Augmented Spaces

Daniel F. Keefe, Bridger Herman, Jung Who Nam, Daniel Orban, and Seth Johnson

In Making Data: Materializing Digital Information, edited by Ian Gwilt, ch. 11, pp. 169–182, Bloomsbury Visual Arts, June 2022. (DOI: 10.5040/9781350133266.ch-011)


  • VR 2019

Worlds-in-Wedges: Combining WIMs and Portals to Support Comparative Immersive Visualization of Forestry Data

Jung Who Nam, Krista McCullough, Joshua Tveite, Maria M. Espinosa, Charles H. Perry, Barry T. Wilson, and Daniel F. Keefe

IEEE VR 2019, Osaka, Japan. (DOI: 10.1109/VR.2019.8797871)


  • Nature Scientific Reports

Signature Maps for Automatic Identification of Prostate Cancer from Colorimetric Analysis of H&E- and IHC-stained Histopathological Specimen

Ethan Leng, Jonathan C. Henriksen, Anthony E. Rizzardi, Jin Jin, Jung Who Nam, Benjamin M. Brassuer, Andrew D. Johnson, Nicholas P. Reder, Joseph S. Koopmeiners, Stephen C. Schmechel, and Gregory J. Metzger

Nature Scientific Reports, vol. 9, no. 6992, May 2019. (DOI: 10.1038/s41598-019-43486-y)


  • SciVis 2019 posters

  • SciVis Best Poster Award

Linked View Visualization Using Clipboard-Style Mobile VR: Application to Communicating Forestry Data

Jung Who Nam, Charles H. Perry, Barry T. Wilson, and Daniel F. Keefe

IEEE VIS 2019, Vancouver, Canada.


  • VRST 2019 posters

Effects of Age and Motivation for Visiting on AR Museum Experiences

Narae Park, Yohan Hong, Hyunjeong Pak, Jung Who Nam, Kyoungsu Kim, Junbom Pyo, Kyungwon Gil, and Kyoobin Lee

ACM VRST 2019, Sydney, Australia. (DOI: 10.1145/3359996.3364711)


  • Leonardo

Spatial Correlation: An Interactive Display of Virtual Gesture Sculpture

Jung Who Nam and Daniel F. Keefe

Leonardo, vol. 50, no. 1, pp. 94–95, Feb 2017. (DOI: 10.1162/LEON_a_01226)


  • Nature Scientific Reports

Microstructure Imaging of Crossing (MIX) White Matter Fibers from diffusion MRI

Hamza Farooq, Junqian Xu, Jung Who Nam, Daniel F. Keefe, Essa Yacoub, Tryphon Georgiou, and Christophe Lenglet

Nature Scientific Reports, vol. 6, no. 38927, Dec 2016. (DOI: 10.1038/srep38927)


  • Radiology

Detection of Prostate Cancer: Quantitative Multiparametric MR Imaging Models Developed Using Registered Correlative Histopathology

Gregory J. Metzger, Chaitanya Kalavagunta, Benjamin Spilseth, Patrick J. Bolan, Xiufeng Li, Diane Hutter, Jung Who Nam, Andrew D. Johnson, Jonathan C. Henriksen, Laura Moench, Badrinath Konety, Christopher A. Warlick, Stephen C. Schmechel, and Joseph S. Koopmeiners

Radiology, vol. 279, no. 3, pp. 805–816, Jan 2016. (DOI: 10.1148/radiol.2015151089)


  • VISAP exhibits

Spatial Correlation: An Interactive Display of Virtual Gesture Sculpture

Jung Who Nam and Daniel F. Keefe

IEEE VIS 2014 Arts Program, Paris, France.

Experience

University of Texas - Austin, TX

Postdoctoral Researcher, Texas Advanced Computing Center (TACC)

At the visitor center, high-resolution tiled displays show images and videos from visualization projects that researchers at the center have worked on. My task is to bring interactive 3D content to these systems, giving visitors a more engaging way to explore ongoing work. Working with software engineers at Intel and research scientists at TACC, I extended Intel's ray-tracing application, OSPRay, to display a single, coherent 3D virtual environment across the tiled displays and added support for gesture-based interaction. We created a proof-of-concept prototype in which a user flies through the 3D virtual environment by raising both hands, like a bird spreading its wings, and leaning the body in the direction of travel.
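
As a rough sketch, the core of that flight control reduces to a few lines. The joint layout, dead-zone threshold, and flightVelocity helper below are illustrative assumptions, not the actual prototype code:

    #include <cmath>

    struct Vec3 { float x, y, z; };

    // Illustrative joint positions from a skeletal tracker (names are hypothetical).
    struct Pose {
        Vec3 leftHand, rightHand, head, pelvis;
    };

    // Returns the camera's flight velocity, or zero when the user is not "flying".
    Vec3 flightVelocity(const Pose& p, float speed) {
        // Engage flight only while both hands are raised above the head.
        if (p.leftHand.y <= p.head.y || p.rightHand.y <= p.head.y)
            return {0.0f, 0.0f, 0.0f};

        // Lean: horizontal offset of the head relative to the pelvis.
        Vec3 lean = {p.head.x - p.pelvis.x, 0.0f, p.head.z - p.pelvis.z};
        float mag = std::sqrt(lean.x * lean.x + lean.z * lean.z);
        if (mag < 0.05f)
            return {0.0f, 0.0f, 0.0f};  // dead zone: standing roughly upright

        // Velocity grows with the lean, so a deeper lean flies faster.
        return {lean.x * speed, 0.0f, lean.z * speed};
    }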

This year, I will also implement voice-control capabilities in the application so that users can navigate a 3D environment with spoken commands. We plan to leverage recent advances in deep learning models to build this interactive system. Our initial prototype will serve as an example of AI agents in real-time applications such as data visualization and as an opportunity to test current deep learning models for verbal interaction with 3D content.


Collaborative Results

  • "Immersive OSPRay: Enabling VR Experiences with OSPRay" (10.1145/3569951.3597579)

Gwangju Institute of Science and Technology - South Korea

Research Engineer, Korea Culture and Technology Institute (KCTI)

During my stay in South Korea, I worked in a lab that partners closely with museums to support a new generation of public exhibitions. My task was to develop interactive installations for museums, working closely with experts from other fields, e.g., graphic designers, data curators, and historians. I developed visualization and interaction techniques that let museum visitors explore the museums' archived data through gesture-based interaction. To ease the integration of assets created by designers and data curators, I implemented features to load those assets and populate 3D scenes and GUIs. By refactoring the code, I also provided a codebase for gesture-based interaction in Unity, which enabled another developer to create separate interactive applications. During my stay, I helped create two interactive installations presented at public venues (each ran for about a week).
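
The asset-integration idea can be sketched as a manifest-driven loader: designers edit a plain data file, and the application rebuilds the scene from it without code changes. The Scene type, populateFromManifest helper, and manifest format below are hypothetical stand-ins (the actual installations were built in Unity):

    #include <fstream>
    #include <iostream>
    #include <sstream>
    #include <string>

    // Stand-in for the engine's scene graph; spawn() would instantiate a prefab.
    struct Scene {
        void spawn(const std::string& asset, float x, float y, float z) {
            std::cout << "spawn " << asset << " at ("
                      << x << ", " << y << ", " << z << ")\n";
        }
    };

    // Each manifest line pairs an asset name with a position, e.g.
    //   gallery_map.png 0.0 1.5 -2.0
    // Designers edit this file; the application repopulates the scene from it.
    void populateFromManifest(Scene& scene, const std::string& path) {
        std::ifstream in(path);
        std::string line;
        while (std::getline(in, line)) {
            std::istringstream fields(line);
            std::string asset;
            float x, y, z;
            if (fields >> asset >> x >> y >> z)
                scene.spawn(asset, x, y, z);
        }
    }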


Collaborative Results

  • "The Road of Hyecho" - Interactive installation at Gwangju Cultural Foundation (news)
  • "The Road of Ramayana" - Interactive installation at Asia Culture Center (video, news, news)
  • "Effects of Age and Motivation for Visiting on AR Museum Experiences" (10.1145/3359996.3364711)

University of Minnesota - Twin Cities, Minneapolis, MN

Research Assistant, Interactive Visualization Lab (IVLab)

During my Ph.D., I worked with experts from other fields, e.g., geology, medical devices, neuroscience, and health. My task was to create interactive 3D systems that assist these experts with analyzing and presenting their data. Mainly, I worked on creating data-driven 3D virtual environments and integrating VR/AR technologies so these experts could immerse themselves in their data to look for findings and confirm hypotheses. Focusing on public-facing content, I also created VR solutions that make these technologies accessible to the public for training and education. This work was presented at the IEEE VR and IEEE VIS conferences.


Collaborative Results

  • "Worlds-in-Wedges: Combining WIMs and Portals to Support Comparative Immersive Visualization of Forestry Data" (10.1109/VR.2019.8797871)
  • "Linked View Visualization Using Clipboard-Style Mobile VR: Application to Communicating Forestry Data"
  • "Hybrid Data Constructs: Interacting with Biomedical Data in Augmented Spaces" (10.5040/9781350133266.ch-011)
  • "Spatial Correlation: An Interactive Display of Virtual Gesture Sculpture" (10.1162/LEON_a_01226)
  • "Microstructure Imaging of Crossing (MIX) White Matter Fibers from diffusion MRI" (10.1038/srep38927)

INRIA - Saclay, France

Research Intern, Analysis and Visualization Lab (AVIZ)

Scientific visualization applications are often complex, requiring substantial expertise in how to use software features. My task was to find ways to facilitate team-science collaboration even when members have different levels of expertise. Our approach was to provide different ways of viewing and interacting with data within one collaborative framework. At the simplest level, data can be viewed as a video file created using animation features we developed. More engaged users can load the same video file in a visualization application and thoroughly explore the depicted data. Users on the go can watch the video file in our custom video player and sketch or leave comments on data views. With this framework, users pick the client that fits their needs and situation, and, importantly, changes made in any client are propagated back to the original data, so everyone stays in sync. This work was published in TVCG.
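
The essence of the framework can be sketched as a shared annotation log attached to the exchanged video, which every client, from the full visualization application down to the mobile video player, can read and write. The Annotation and SharedSession types below are hypothetical simplifications, not V-Mail's actual data format:

    #include <string>
    #include <vector>

    // A comment or sketch anchored to a frame of the exchanged video.
    struct Annotation {
        int videoFrame;       // which data view the note refers to
        std::string author;
        std::string content;  // text note or serialized sketch strokes
    };

    struct SharedSession {
        std::string videoFile;               // the lowest-common-denominator view
        std::vector<Annotation> annotations; // synced back to the original data

        // Any client appends here; a sync step then propagates the log so
        // collaborators on every other client see the change.
        void addAnnotation(const Annotation& a) { annotations.push_back(a); }
    };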


Collaborative Results

  • "V-Mail: 3D-Enabled Correspondence about Spatial Data on (Almost) All Your Devices" (10.1109/TVCG.2022.3229017)

University of Minnesota - Twin Cities, Minneapolis, MN

Programmer, Center for Magnetic Resonance Research (CMRR)

When a patient is diagnosed with prostate cancer, the organ is often surgically removed. To study the cancer, pathologists cut the prostate into slices and then into subsections small enough to scan with tissue-scanning devices. My task was to develop a series of tools to assist pathologists with reconstructing a prostate volume and annotating it. I helped create a Photoshop-like application that enabled pathologists to stitch the scanned images back together and draw cancer boundaries. I also implemented export features to save these annotated slice images in a file format suited to further data analysis. Researchers in the lab used the annotated data to build models for detecting prostate cancer; their work was published in Nature Scientific Reports and Radiology.
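
A minimal sketch of that export step, assuming a simplified text format; the AnnotatedSlice type and exportSlice helper are illustrative, not the lab's actual file format:

    #include <fstream>
    #include <string>
    #include <vector>

    struct Point { float x, y; };

    // One reconstructed slice: the stitched image plus the cancer boundaries
    // the pathologist drew, each stored as a closed polygon.
    struct AnnotatedSlice {
        std::string imagePath;
        std::vector<std::vector<Point>> boundaries;
    };

    // Writes the image reference and each boundary polygon as one line of
    // vertex coordinates, ready for downstream analysis scripts.
    void exportSlice(const AnnotatedSlice& s, const std::string& outPath) {
        std::ofstream out(outPath);
        out << s.imagePath << "\n" << s.boundaries.size() << "\n";
        for (const auto& polygon : s.boundaries) {
            out << polygon.size();
            for (const auto& p : polygon) out << " " << p.x << " " << p.y;
            out << "\n";
        }
    }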


Collaborative Results

  • "Signature Maps for Automatic Identification of Prostate Cancer from Colorimetric Analysis of H&E-and IHC-stained Histopathological Specimen" (10.1038/s41598-019-43486-y)
  • "Detection of Prostate Cancer: Quantitative Multiparametric MR Imaging Models Developed Using Registered Correlative Histopathology" (10.1148/radiol.2015151089)

Skills