People to Watch 2016

Irene Qualters
Director, Division of Advanced Cyberinfrastructure (CISE/ACI)
National Science Foundation (NSF)

Irene Qualters is a Program Director within the Office of Cyberinfrastructure of the National Science Foundation (NSF). Her primary focus there is NSF's HPC projects and solicitations, including such initiatives as Blue Waters, G8 Exascale Software and Petascale Applications. Prior to joining NSF in 2010, she spent over 25 years in technical and executive leadership roles within industry. Her experience spans startups to well-established companies, with a focus primarily on HPC software and hardware R&D (Cray, SGI) as well as the breadth of cyberinfrastructure for research scientists (Merck Research Labs). She has led large and small international teams. Her areas of expertise include parallelism and system architectures. She is particularly interested in innovative technologies and sustainable operating models for HPC.

HPCwire: Hi Irene. Congratulations on being selected as an HPCwire 2016 Person to Watch. As a key driver of NSF's efforts to transform the nature of cyberinfrastructure, you have spoken in the past about increasing the breadth and depth of science, engineering, research and competitiveness. This is especially critical in HPC, where a significant firewall barricades our industry from the public and even the larger scientific community. What do you hope to accomplish toward that end in 2016?

Irene Qualters: Just as the use of technology is transforming our personal lives, the use of "cyber" is stimulating new possibilities in scientific and engineering research, much of it highly multidisciplinary, such as Innovations at the Nexus of Food, Energy and Water Systems (INFEWS) or the national BRAIN Initiative. It is intuitive that topics such as these will be both data- and computationally intensive, linking instruments with computing in more dynamic and cohesive workflows. Less intuitive, perhaps, is that both the research and its potential for societal benefit will advance best if there is increased interaction, interoperability and sharing among public, private and commercial cyberinfrastructure. Key investment strategies include secure access to distributed data; easy access to a range of potentially disruptive generations of HPC at scale; and availability of software that is highly adaptable, innovative and reusable over time. And, of course, none of this is possible without an effort in building, sharing and sustaining expertise.

Reimagining HPC in this context will both challenge and energize our community. Universities, government agencies and the private sector each have a significant and ongoing role in creating and realizing such a vision. NSF's unique vantage point and contribution come from the diversity of the science and engineering research it supports, from its emphasis on the integration of education and research, and from its prominence in supporting foundational research in computing science and engineering. Equally important is our enduring engagement with academic institutions to foster innovation and industry partnerships.

As a complement to our role in NSCI, and with community input, we are assessing what has been accomplished through existing strategies, initiatives and programs such as Cyberinfrastructure for the 21st Century (CIF21), with an eye to a more far-reaching future. We have engaged the National Academy of Sciences (NAS) to help us anticipate possible HPC scenarios, and our NSF Advisory Committee for Cyberinfrastructure (ACCI) is providing guidance and perspective.

HPCwire: In addition to overseeing projects such as Blue Waters and XSEDE, you're also a primary lead for NSF's co-leadership role in the National Strategic Computing Initiative. What should we be excited about coming out of NSCI in the coming year?

Qualters: NSCI presents an opportunity for HPC to redefine itself with the help of researchers from industry, government and academia whose advances are constrained by the limits of today's HPC. These limitations may have originated with the end of Moore's law, but over time HPC has become less prominent in the overall technology landscape, and today's limitations are equally associated with considerations such as access, software usability, innovation in methods and algorithms, or the requirement for unique skillsets. Each agency brings its own mission focus and individual strengths. And we are fortunate to be in a time of strong technology innovation in the private sector. NSCI has fostered a much greater shared understanding and a collective resolve among the participating agencies.

NSF has begun and will continue to engage the many academic research and cyberinfrastructure communities to identify scientific and engineering frontiers that can only be explored with capabilities beyond today's HPC platforms and today's scientific applications. These engagements include our Advisory Committee on Cyberinfrastructure (ACCI), university campuses, and large facilities and instruments. We are also beginning to "recapitalize" existing HPC investments with an eye toward the future, considering the rich diversity of approaches that can be encompassed within HPC. These plans will be more visible in 2016. Further, we expect the final report of the National Research Council study, Future Directions for NSF Advanced Computing Infrastructure to Support U.S. Science and Engineering in 2017-2020, to inform our support for exploration of much more innovative and sustainable operating models for HPC in a dynamic landscape. And finally, we expect to initiate a follow-on activity to the NSF-wide CIF21 initiative that is more specifically data-focused and complementary to NSCI.

HPCwire: Generally speaking, on the subject of high performance computing, what do you see as the most important trends for 2016 that will have an impact now and into the future?

Qualters: The maturing and use of data science is a profound and potentially unifying trend for HPC use in industry, government and academic research. How we anticipate and respond will say a lot about HPC's relevance and contribution to the future. Data science exists at the interface of several disciplines, is critical to the fundamental understanding of complex systems, and is necessarily computationally intensive. It has relevance both to society and to research, and it is attracting many in tomorrow's workforce. We in HPC have the chance to become more active participants in a larger conversation, unifying modeling and simulation platforms with data science platforms. There are both individuals and funded projects that explore and advocate for this participation today. I think we'll see more in 2016.

HPCwire: Outside of the professional sphere, what can you tell us about yourself – personal life, family, background, hobbies, etc.?

Qualters: As a Minnesotan, I enjoy the outdoors, whether in winter or summer or the few weeks in between. I have a special affinity for native prairies and other wild places. I am very fortunate to have wonderful, eclectic friends and family.

HPCwire: Final question: What can you share about yourself that you think your colleagues would be surprised to learn?

Qualters: Together with a friend, I once flew a 1941 Boeing Stearman open-cockpit biplane from Minnesota to Nevada. It was magical. One could understand how Lindbergh and others fell in love with flying in the early days.


Toni Collis, Women in HPC
Jim Ganthier, Dell
Sumit Gupta, IBM
Dr. Yutong Lu, NUDT
Bill Mannel, HPE
Hartmut Neven, Google
William "Tim" Polk, OSTP
Irene Qualters, NSF
Thomas Sohmers, REX Computing
John West, TACC (SC16 Chair)
Kathy Yelick, LBL
