Despite decades of funding for cybersecurity efforts, the problem has fundamentally not been addressed, Anita Nikolich of the Illinois Institute of Technology said in a plenary talk at the PEARC18 conference in Pittsburgh, Pa., on July 26. Moving the needle on cybersecurity will require engagement of academic researchers with the hacking community, she argued.
“I’ve seen both sides of this,” said the former Program Director for Cybersecurity at the NSF. “You have these two worlds of professional, academic researchers and nonprofessional researchers. How can we eliminate the boundaries? How can we bring these two together to make the world more secure?”
The annual Practice and Experience in Advanced Research Computing (PEARC) conference—with the theme Seamless Creativity this year—stresses key objectives for those who manage, develop and use advanced research computing throughout the U.S. and the world. This year’s program offered tutorials, plenary and contributed talks, workshops, panels, poster sessions and a visualization showcase.
In her talk, Nikolich noted that academic cybersecurity conferences and hacker conferences such as DEFCON cover strikingly similar topics and content; yet, she argued, the two silos remain difficult to connect.
One problem, she said, is that the academic funding cycle rewards incremental results but not interdisciplinary work or practical application. “There isn’t as much freedom these days to dabble across topics [and] not very many grand challenges are being solved.”
The $80 million the NSF alone spends annually, in addition to spending by other government agencies and funding foundations, has done little to produce better cybersecurity, Nikolich argued. Congress has begun pushing for results, but the community hasn't yet figured out how to produce them.
“Why are there still so many security problems?”
This, despite surprisingly similar work being done by the two communities.
“The agendas at academic security and hacker conferences are suspiciously similar,” she said. “[But] they don’t want to be seen at each other’s conferences.”
For the hacker community, being taken seriously and contributing to the research corpus isn’t just an obstacle. It’s a brick wall. Nikolich cited Jay Radcliffe, who hacked his own insulin pump and discovered it could be reprogrammed remotely—a potentially life-threatening cyberintrusion. His work was good enough to get Johnson & Johnson, which made the pump, to work with him and conduct a recall. But academic publications rejected the paper and its methodology.
Meanwhile, academic researchers can present their work in journals and invitation-only conferences but their results seldom affect the public debate over security issues. In 2017, DEFCON received popular coverage by featuring voting security for the first time—which left academic researchers, who had been publishing on the phenomenon since 2006, wondering why the earlier work had not been noticed.
Academic silos don’t help, Nikolich added.
“Who really owns voting security?” she said. “Proposals don’t generally do well [in funding] because we can’t decide.”
Obstacles to hacker/academic collaboration include inconsistent judgments from Institutional Review Boards that are more familiar with animal or human clinical research; the uncertain legality of many useful hacking techniques and the risk of facing civil or criminal legal action for publicizing vulnerabilities; the lack of funding for “offensive research,” which focuses on discovering vulnerabilities and how to exploit them; and the problematic nature of publishing details about hacks. Even accessing paywalled journal articles can be a big obstacle for unfunded hacker/researchers working in their spare time.
From the academic side, the system incentivizes publishing many incremental papers, acquiring tenure, getting grants and above all not undertaking projects with a high probability of failure. But it does not necessarily reward workable security solutions. This pairs with the inherent difficulty of identifying and engaging trusted hacker partners.
“The [hacker] circuit can be kind of a sideshow if you don’t know how to navigate it.”
And yet some bridges have been formed, according to Nikolich. From 2011 to 2013, DARPA offered quick-turnaround micro-grants to hackers, hacker spaces and maker labs. One success story from this program was Charlie Miller's and Chris Valasek's car-hacking research, presented at DEFCON in 2015. Among conferences, the Computer Human Interaction (CHI), Association for the Advancement of AI (AAAI) and Digital Shoreditch Festival in London have run successful tracks or workshops bringing the academic and hacker communities together.
Nikolich advocated for "adventurous faculty" to sponsor nonacademics, encourage undergraduates to participate and invite nonacademics to conferences and workshops. On the funding side, government agencies and foundations could fund an "underground component" similar to the DARPA micro-grants. Community advisory boards, crowdsourced research ideas, and clearer guidance on what research is ethical will be necessary as well, in particular rewriting the federal Computer Fraud and Abuse Act to reflect a better understanding of the field, since the act tends to punish individuals who discover vulnerabilities while leaving companies free to ignore them.
Ken Chiacchia is a Senior Science Writer with the Pittsburgh Supercomputing Center.