
Saeed Boorboor joins CS faculty

Saeed Boorboor

This semester, Assistant Professor Saeed Boorboor joined the CS department. His research centers on immersive visualization, giving color, meaning, and significance to data, with a particular focus on scientific visualization and, more broadly, biomedical data.

When faced with an Excel sheet filled with numbers, most people struggle to interpret the data, searching for meaning and context. Scientists amass vast datasets, and visualization systems that let them explore, analyze, and interact with that data help them home in on salient information. Boorboor uses human-centered design, image processing, computer graphics, and applied AI to investigate novel visualization methods, from hand-held and wearable devices to large displays.

Boorboor is excited to join UIC, the pioneer in immersive electronic visualization and home to the Electronic Visualization Laboratory’s CAVE and CAVE 2 facilities. Prior to joining UIC, Boorboor worked as a principal research scientist at Stony Brook University’s Center for Visual Computing, where he also received his PhD in computer science. He designed and constructed the FlexiCAVE, the world’s first foldable and reconfigurable tiled-wall immersive facility, enabling adaptable spatial configurations for research in immersive analytics and human-computer interaction.

Medical Visualization

At Stony Brook, Boorboor worked on bringing CT scan data to life, translating the scanner’s raw output into the shapes and colors of the images a scientist perceives. His immersive visualization techniques allow scientists and doctors to navigate three-dimensional anatomical spaces, in projects ranging from the colon to the brain.

He and a team at Stony Brook created virtual colonoscopy, a noninvasive, FDA-approved technique that is essentially a CT scan whose data is projected volumetrically, or in three dimensions. The approach could eliminate the need for invasive procedures for most people undergoing this routine diagnostic test. Using a technique envisioned by Boorboor’s PhD advisor, a doctor can “float” through the colon, as if in a submarine. The same method can detect tumors as small as 2 mm and be used to conduct virtual biopsies of them.
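To make “projects data volumetrically” a little more concrete, here is a minimal Python sketch, not Boorboor’s code and far simpler than a real virtual colonoscopy renderer, that collapses a synthetic 3D volume standing in for a CT scan into a 2D image using a maximum intensity projection:

    import numpy as np

    def make_synthetic_volume(n=64):
        """A blurry bright sphere inside a dark cube, standing in for CT data."""
        z, y, x = np.mgrid[0:n, 0:n, 0:n]
        r = np.sqrt((x - n / 2) ** 2 + (y - n / 2) ** 2 + (z - n / 2) ** 2)
        return np.exp(-(r / (n / 4)) ** 2)  # scalar values in (0, 1]

    def max_intensity_projection(volume, axis=0):
        """Collapse the volume to 2D by keeping the brightest voxel along each ray."""
        return volume.max(axis=axis)

    vol = make_synthetic_volume()
    image = max_intensity_projection(vol)
    print(image.shape, image.max())  # (64, 64) projection, peak value 1.0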

“Now you have 100% control and view, and it’s safer,” Boorboor said. “In a traditional colonoscopy or endoscopy, you have a camera with attached tools. But the colon has muscular folds, and if you want to see polyps, which are precursors to cancer, behind a fold, you have to turn the camera very politely so that you don’t perforate the colon.”

Boorboor conducted similar research into virtual pancreatography, creating 3D models to locate where cancer cells are growing and to help establish surgical margins for tumor removal.

Eventually, Boorboor’s visualization work led to the brain, where he worked with neuroscientists from the National Institutes of Health to understand cognitive decline, particularly Alzheimer’s disease.

Brain scans include staggering amounts of data: a one-millimeter cube of a mouse’s brain, for example, contains petabytes of data. To put this in perspective, the entire Library of Congress holds around 20 petabytes. Boorboor is taking these microscopy datasets and presenting them in a form where they can be queried, so a scientist can zoom in on a particular neurite and track where it connects, its density, and how it branches out.
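As a rough back-of-the-envelope check of that scale, using assumed electron-microscopy voxel sizes of about 4 × 4 × 30 nanometers and one byte per voxel rather than figures from Boorboor’s datasets, a one-millimeter cube does indeed land in petabyte territory:

    # Assumed acquisition parameters, for illustration only.
    lateral_nm, axial_nm = 4, 30      # hypothetical voxel size in nanometers
    cube_nm = 1_000_000               # 1 millimeter expressed in nanometers

    voxels = (cube_nm / lateral_nm) ** 2 * (cube_nm / axial_nm)
    bytes_total = voxels * 1          # 8-bit grayscale: one byte per voxel
    petabytes = bytes_total / 1e15

    print(f"{voxels:.2e} voxels, about {petabytes:.1f} PB")  # ~2.08e+15 voxels, about 2.1 PB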

“For Alzheimer’s particularly, my work was using AI to predict what is happening to the neurobiological system that is responsible for cognition,” Boorboor said.

Visualizations to Keep People Safe

After Superstorm Sandy hit New York a decade ago, Boorboor worked with atmospheric scientists and their superstorm simulation models to develop Submerse, an immersive, life-sized flood visualization system. He took data such as sea level rise, wave patterns, and direction of water flow, and overlaid it on 3D maps of New York City that included traffic and demographics. By tweaking various parameters, emergency managers could view worst-case flooding scenarios.
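At its core, that kind of overlay is a comparison between a simulated water surface and terrain heights. The toy Python sketch below uses an invented elevation grid and an assumed 1.5-meter surge level, nothing from the actual Submerse system, to flag which cells end up underwater and how deep:

    import numpy as np

    rng = np.random.default_rng(0)
    elevation_m = rng.uniform(0.0, 5.0, size=(100, 100))  # hypothetical terrain heights (meters)
    water_level_m = 1.5                                    # assumed storm-surge level (meters)

    depth_m = np.clip(water_level_m - elevation_m, 0.0, None)  # water depth in each cell
    flooded = depth_m > 0.0

    print(f"{flooded.mean():.0%} of cells flooded, max depth {depth_m.max():.2f} m")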

“What does the flooding look like? Is water covering my foot, at my knee, or will it be chest height? And can we pass a fire truck through it, and conduct evacuations?” Boorboor said. “My goal is taking all the scientific data, medical data, simulation data, physics data, and immersing scientists so they have a wholesome view of it.”

In meetings with emergency responders, Boorboor said, the data showed that the areas that would flood first were more affluent and contained newer buildings. Many of these residents would have the means to evacuate quickly and often had second homes to retreat to. Areas that flooded later in the simulation contained older buildings and older, less affluent residents who were less likely to be able to evacuate. The researchers brainstormed with emergency managers to create potential evacuation routes and other emergency responses for future storms.

Joining UIC

At UIC, Boorboor is working with EVL faculty and staff to design and develop the next-generation immersive facility, a CAVE 3. Argonne National Laboratory is collaborating with EVL on incorporating big data visualization and plans to explore quantum visualization.

This semester, Boorboor is teaching CS 428, Virtual, Augmented, and Mixed Reality. He is accepting graduate and undergraduate students to work with him in his lab.