Sept. 15 — University of Illinois at Urbana-Champaign researchers Professor Wendy K. Tam Cho and Yan Y. Liu recently won first place in the Common Cause 2016 First Amendment Gerrymander Standard Writing Competition with their proposal of a novel method for identifying partisan gerrymandering, developed using the Blue Waters supercomputer at the National Center for Supercomputing Applications (NCSA).
Partisan gerrymandering is the drawing of electoral district lines in a manner that discriminates against a political party. In his 2016 State of the Union address, President Obama called on lawmakers and the public to take a number of steps to change the current redistricting systems. “We have to end the practice of drawing our congressional districts so that politicians can pick their voters, and not the other way around,” said President Obama.
Cho recounts that while the Supreme Court has also expressed considerable interest in curtailing partisan gerrymandering, it has lacked the tools to “assess how much unfairness is in a map.” Cho and Liu had used Blue Waters in other research projects and knew that the solution to identifying partisan gerrymanders rests with high-performance computing.
Their idea was sparked by a long-standing discussion in redistricting circles about creating the set of all possible maps. Having such a comparison set provides the proper context for judging a disputed partisan gerrymander. However, the size of this set is astronomically and prohibitively large. Cho says that “People have always liked that idea but the ability to draw all possible maps is not possible.”
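To see why the full comparison set is out of reach, consider a rough count. Even ignoring contiguity, population balance, and every other legal criterion, the number of ways to partition n geographic units into k non-empty districts is a Stirling number of the second kind, which grows explosively in n. The sketch below is an illustrative back-of-envelope calculation, not the authors' method:

```python
from math import comb, factorial

def stirling2(n: int, k: int) -> int:
    """Number of ways to partition n labeled units into k non-empty groups
    (Stirling number of the second kind), via inclusion-exclusion:
    S(n, k) = (1/k!) * sum_{j=0..k} (-1)^j * C(k, j) * (k - j)^n."""
    return sum((-1) ** j * comb(k, j) * (k - j) ** n
               for j in range(k + 1)) // factorial(k)

# Hypothetical example: grouping 55 counties into 4 districts,
# with no contiguity or population constraints at all.
print(stirling2(55, 4))   # already an astronomically large number
```

Real redistricting constraints prune this space, but not nearly enough to make exhaustive enumeration feasible, which is what motivates sampling a representative subset instead.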
Cho and Liu delved into the substantive theoretical justifications and devised a path that dramatically shrinks the number of maps required. They then developed a massively parallel algorithm, with runtime collaboration among problem-solving processes, that harnesses Blue Waters to generate an unprecedented number of maps satisfying specified criteria.
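The article does not publish the algorithm itself, but the general shape of the approach — many parallel workers independently sampling candidate maps and keeping only those that satisfy specified criteria — can be sketched in miniature. Everything below (the precinct populations, the balance criterion, the function names) is a hypothetical toy, not Cho and Liu's code:

```python
import random
from concurrent.futures import ProcessPoolExecutor

# Toy problem: assign each precinct to one of k districts. Real redistricting
# adds contiguity, compactness, and legal criteria; this sketch checks only
# a population-balance criterion, purely for illustration.
POPULATIONS = [10, 12, 9, 11, 10, 8, 13, 10]   # hypothetical precinct populations
NUM_DISTRICTS = 2
TOLERANCE = 0.10   # each district must be within 10% of the ideal population

def balanced(assignment):
    """Toy criterion: every district's total population is near the ideal."""
    ideal = sum(POPULATIONS) / NUM_DISTRICTS
    totals = [0] * NUM_DISTRICTS
    for unit, district in enumerate(assignment):
        totals[district] += POPULATIONS[unit]
    return all(abs(t - ideal) <= TOLERANCE * ideal for t in totals)

def sample_valid_maps(seed, tries=10_000):
    """One worker: draw random assignments, keep those passing the criterion."""
    rng = random.Random(seed)
    found = set()
    for _ in range(tries):
        assignment = tuple(rng.randrange(NUM_DISTRICTS) for _ in POPULATIONS)
        if balanced(assignment):
            found.add(assignment)
    return found

if __name__ == "__main__":
    # Independent workers explore the space in parallel; results are merged.
    with ProcessPoolExecutor() as pool:
        results = pool.map(sample_valid_maps, range(4))
    all_maps = set().union(*results)
    print(f"distinct maps satisfying the criterion: {len(all_maps)}")
```

At scale, naive independent sampling wastes effort on invalid or duplicate maps, which is presumably why the authors emphasize runtime collaboration among processes rather than embarrassingly parallel sampling.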
“We drew 800 million maps … no one has drawn more than about 10,000, and usually the number is about 1,000. We create orders of magnitude more maps which enables the type of statistical analysis the Court needs to explore political gerrymanders in context. Our maps are not just greater in number, they are also of much higher quality than anyone has ever produced.”
Moving forward, Cho and Liu plan to refine the algorithm: tailoring it closely to Supreme Court mandates, increasing its efficiency, and making it applicable to an even wider array of political scenarios, since district maps are in constant flux. Their research embodies the computational tools the legal system needs to determine a workable standard for regulating partisan gerrymandering, something the courts have sought for decades.
Researchers can develop tools, but ultimately, only the Court can adopt them as the legal standard. All the same, by providing the legal community with the computational tools necessary for policing gerrymandering, Cho and Liu’s work is a step toward a future in which high-performance computing will have an increasing influence on society and governance.
“We spend a lot of time thinking about computing for a host of different problems, but not generally in social science. Social science has been slow getting into the computational realm, especially high performance computing, but there are many ways we can use computing to improve society by, for instance, integrating these technological advances with our democratic ideals.”
Source: Hannah Remmert, NCSA