Tanya Berger-Wolf
Professor, Computer Science
Wildbook is a tech-for-wildlife conservation platform that creates artificial-intelligence solutions to combat species extinction and preserve the planet’s biodiversity. Wildbook’s technology trains computers to analyze photographs and videos of animal populations all over the world, helping scientists count species and even follow individual animals based on their unique stripes or spots. It can also be thought of as a way to crowdsource conservation: the vacation photos and YouTube videos that people upload online are turned into scientific data about animal populations. Wildbook enabled the Great Grevy’s Rally, the first and most accurate full census of an entire species (the endangered Grevy’s zebra) in history, based on photos taken by ordinary citizens, from schoolchildren and rangers to tourists and government officials. The results changed conservation policy for the species in Kenya.
An overview research paper written by Dr. Berger-Wolf and her research team is online here, and their original paper on animal identification, from 2013, can be found here. In addition, this National Geographic article explains why Wildbook is important.
Bhaskar DasGupta
Professor, Computer Science
A mathematical theory of gerrymandering
Partisan gerrymandering is inarguably a major cause of voter disenfranchisement in the United States, but advocacy groups have had limited success in persuading U.S. courts to adopt specific measures to quantify the phenomenon. Nicholas Stephanopoulos and Eric McGhee introduced a new way to measure partisan gerrymandering using an “efficiency gap,” the difference in wasted votes between the two parties in a two-party system, expressed as a share of all votes cast, and a U.S. appeals court utilized this measure to evaluate a claim that the legislative map of the state of Wisconsin was gerrymandered. UIC’s DasGupta and his team explored computational aspects of this measure to provide a simple, fast algorithm that can “un-gerrymander” district maps for Texas, Virginia, Wisconsin, and Pennsylvania by bringing their efficiency gaps from unacceptable to acceptable levels. To the best of our knowledge, this algorithm represents the first publicly available implementation and corresponding evaluation of the efficiency-gap measure. The research shows that computational methods could be used in practice to determine district maps with acceptable efficiency gaps. Should the U.S. Supreme Court uphold the decisions of lower courts with regard to the efficiency-gap approach, the work of Dr. DasGupta’s team will provide crucial support in the effort to resolve partisan gerrymandering.
A research paper by Dr. DasGupta’s research team is available here: Alleviating partisan gerrymandering: can math and computers help to eliminate wasted votes?
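The wasted-vote arithmetic behind the efficiency gap is simple enough to sketch in a few lines. This is a minimal illustration of the Stephanopoulos-McGhee definition, not the team's implementation; under the sign convention used here, a positive gap means party A wasted more votes than party B:

```python
def wasted_votes(votes_a, votes_b):
    """Wasted votes in one two-party district: all of the losing
    party's votes, plus the winner's votes beyond the 50% threshold."""
    threshold = (votes_a + votes_b) / 2
    if votes_a > votes_b:
        return votes_a - threshold, votes_b
    return votes_a, votes_b - threshold

def efficiency_gap(districts):
    """Efficiency gap: difference in total wasted votes between the
    two parties, as a share of all votes cast.
    districts: list of (votes_a, votes_b) pairs, one per district."""
    wasted_a = wasted_b = 0.0
    for va, vb in districts:
        wa, wb = wasted_votes(va, vb)
        wasted_a += wa
        wasted_b += wb
    total = sum(va + vb for va, vb in districts)
    return (wasted_a - wasted_b) / total
```

On a symmetric map such as `[(60, 40), (40, 60)]` both parties waste equally and the gap is zero; a map that packs one party into a single lopsided district while the other wins several close races produces a large gap.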
Venkat Venkatakrishnan
Professor and Associate Dean of Research
Web applications are the engines that power modern websites such as Facebook and Amazon. Their popularity and the sensitive data they handle make them valuable targets for security attacks, with potentially catastrophic consequences: financial losses for businesses and privacy losses for consumers. The work of Venkat Venkatakrishnan and his research team represents an effort to secure web applications by automatically generating exploits that validate the existence of vulnerabilities in large, complex, and dynamic web applications. The team has found significant vulnerabilities in web applications such as MediaWiki, the software that hosts Wikipedia; HotCRP, a popular conference-management system; and the open-source survey system LimeSurvey, among others. Their ongoing work addresses the problem of finding and fixing these security holes across web-application platforms. In the field of cybersecurity, there is an ongoing arms race between “black hat” criminal hackers and “white hat” ethical hackers. Dr. Venkatakrishnan’s work, supported primarily by DARPA and NSF grants, aims to provide the fundamental research and technological advances needed to tip the balance in favor of the ethical hackers.
Read Dr. Venkatakrishnan’s research team’s Distinguished Paper Award-winning submission to the USENIX Security 2018 conference, as well as a paper from the 2016 Conference on Computer and Communications Security in Vienna, Austria.
Barbara Di Eugenio
Improving robot-human communication
Robots that can collaborate with humans on everyday physical tasks could provide assistance in a variety of settings, such as doing household chores, helping the elderly remain independent, and assisting human workers on the factory floor. To truly collaborate with humans, however, robots must be adept at human interaction, which is rich in signals beyond the spoken word, such as pointing gestures and haptic exchanges that involve touch and force. Only when we can model these extralinguistic signals will we be able to develop assistants that can help in the physical world, as opposed to disembodied agents like Alexa and Siri. In the RoboHelper project, Barbara Di Eugenio and her research team focused on building a multimodal interface for communication between an elderly person and a robot. The team, which includes Milos Zefran from UIC’s electrical and computer engineering department, collected interactions between elderly people and human assistants to determine the ingredients needed for successful person-robot interaction. To date, the team has derived computational models of the roles that spoken words, pointing gestures, and haptic exchanges play in identifying the objects and actions that an assistant needs to perform. Di Eugenio and her collaborators are in the process of deploying these models on an actual robot to demonstrate the feasibility of the approach.
Learn more in this research paper that Di Eugenio and her colleagues published in the journal Computer Speech and Language.
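The core idea of combining spoken words with a pointing gesture to identify an object can be illustrated with a toy example. This is a hypothetical sketch, not the team's actual model; the object names, scoring functions, and weights are invented for illustration:

```python
import math
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str   # object label, e.g. "red mug"
    x: float    # position on the table
    y: float

def language_score(utterance, candidate):
    """Fraction of the candidate's label words mentioned in the utterance."""
    words = set(utterance.lower().split())
    label = candidate.name.lower().split()
    return sum(w in words for w in label) / len(label)

def gesture_score(point_xy, candidate, scale=0.5):
    """Score decays exponentially with distance from the pointing target."""
    d = math.dist(point_xy, (candidate.x, candidate.y))
    return math.exp(-d / scale)

def resolve_referent(utterance, point_xy, candidates,
                     w_lang=0.6, w_gesture=0.4):
    """Pick the object that best explains both signals jointly."""
    return max(candidates,
               key=lambda c: w_lang * language_score(utterance, c)
                             + w_gesture * gesture_score(point_xy, c))
```

For example, given a "red mug" at the table's origin and a "blue cup" farther away, the utterance "hand me that mug" plus a point near the origin jointly select the mug, even though neither signal alone fully disambiguates the request.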