IBM Awards Grant to Univ. of Texas for Grid Project

By Nicole Hemsoth

February 28, 2005

IBM announced that the University of Texas has joined IBM's Academic Initiative, a program designed to better prepare students for the information technology (IT) and computer science jobs of tomorrow. IBM has chosen UT to participate in the program, which it is also making available to select schools around the country.

The announcement was made at the IBM Austin Center for Advanced Studies' Sixth Annual Conference, where IBM-supported researchers from more than 20 universities worldwide presented innovative techniques and cutting-edge research results in hardware, software, systems technology and business management.

The university will collaborate with IBM on several levels, including skills building, curriculum development, and academic research and recruitment. The initiative expands on ongoing partnerships between IBM and the University of Texas that are already producing results in Austin. In fact, UT is among IBM's top schools for recruitment, a testament to the caliber of education its students receive.

IBM also announced a new Shared University Research (SUR) grant awarded to UT to support a second phase of research on the UT Grid project. The grant will fund two IBM employees working full-time with UT on Phase II of the project.

Led by the Texas Advanced Computing Center (TACC), the UT Grid project aims to facilitate advanced research and new educational applications by developing and deploying software technologies that integrate the diverse computational, storage, visualization, data and instrument resources of The University of Texas. The project evaluates and uses existing distributed and Grid computing technologies, and develops additional software to integrate these resources. A major thrust of the project is integration from “personal-scale to terascale”: providing software that links people's laptops and desktops directly into the campus Grid and makes it easier to use the distributed large-scale resources around the campus.
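The announcement does not describe the UT Grid software itself, but the “personal-scale to terascale” idea can be sketched as a desktop-side client that runs small tasks locally and hands larger ones off to a shared campus resource. The Python below is a purely hypothetical illustration: CampusGridClient, the gatekeeper hostname, and the CPU threshold are all invented for this sketch, and a real 2005-era deployment would more likely have spoken an established grid protocol such as Globus GRAM.

```python
# Hypothetical sketch only: the real UT Grid middleware is not described in
# the announcement. All names below (CampusGridClient, the gatekeeper host,
# LOCAL_CPU_LIMIT) are invented for illustration.
import subprocess

LOCAL_CPU_LIMIT = 2  # jobs needing this many CPUs or fewer run on the desktop


class CampusGridClient:
    """Toy client that links a desktop into a campus grid."""

    def __init__(self, gatekeeper: str):
        self.gatekeeper = gatekeeper  # campus job-submission endpoint (invented)

    def run(self, command: list[str], cpus: int) -> str:
        if cpus <= LOCAL_CPU_LIMIT:
            # Personal scale: execute directly on the user's own machine.
            result = subprocess.run(command, capture_output=True,
                                    text=True, check=True)
            return result.stdout
        # Terascale: route the job to the shared campus resource instead.
        return self._submit_remote(command, cpus)

    def _submit_remote(self, command: list[str], cpus: int) -> str:
        # Placeholder: a real client would speak the grid's job-submission
        # protocol (e.g. Globus GRAM in that era) to self.gatekeeper.
        return f"submitted {' '.join(command)} ({cpus} CPUs) to {self.gatekeeper}"


client = CampusGridClient("grid.tacc.utexas.edu")  # hostname is hypothetical
print(client.run(["echo", "hello"], cpus=1))             # runs locally
print(client.run(["./simulate", "seismic"], cpus=512))   # routed to the grid
```

The point of the sketch is the routing decision: from the user's perspective the laptop and the terascale machine sit behind one interface, which is the integration the project describes.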

The second phase of the project will focus on expanding the Grid and extending the applications deployed on it to new areas, such as oil exploration using visualization of seismic data. The grant is part of the latest series of SUR awards, which bring IBM's contributions to collaborative research to more than $75 million over the last three years.

“Our relationship with IBM is firing on all cylinders. Every year UT computer sciences faculty members receive IBM awards, which help them start innovative research programs in architecture, compilation, networking, AI, formal methods, and others,” said J. Moore, chairman of the computer sciences department at the University of Texas. “IBM has long been one of the most active recruiters of our graduates, and we work closely with IBM on several major projects, including IBM's PERCS project and our own TRIPS project — a revolutionary new microprocessor architecture designed to help industry stay on Moore's curve.”

“The number of people training in computer sciences is dropping nationally, even as the US Department of Commerce projects that science and engineering job growth will be largest in the IT sector,” continued Moore. “IBM's proactive Academic Initiative is an excellent example of an industry-academic alliance to help solve a major problem for the State and Nation.”

The IBM Academic Initiative is a program offering a wide range of technology education benefits to colleges and universities. IBM will work with schools that support open standards and that seek to use open source and IBM technologies for teaching, both directly and virtually via the Web.

As part of the Academic Initiative, IBM will work with select schools that support open standards to achieve three key objectives:

  • Training an IT workforce to fill the new kinds of jobs that are emerging at IBM and across the industry.
  • Providing the right skills to the next generation of IT workers to ensure they are qualified for the jobs of tomorrow.
  • Ensuring that universities have the most current, relevant curricula that map to the kinds of jobs that are expected, so schools can be attractive for enrollment, funding and growth.

“We must help ensure that the students of today are prepared to be the technology leaders of tomorrow,” said Margaret Ashida, director of corporate recruitment at IBM. “As new high-value, high-paying jobs continue to emerge, we are pleased to be working with UT in our mutual commitment to fill the skill pipeline. Through the IBM Academic Initiative, UT can infuse open technology throughout their IT curriculum and provide their students with the relevant skills, training and open standards knowledge so they can succeed.”

In an increasingly competitive global economy, the IT leaders of tomorrow will pursue innovations that fuse several different disciplines. IBM, which champions open standards as the technology of choice for independent software vendors (ISVs), the leading influencers of today's marketplace, now seeks to advance open standards among the next generation of IT professionals. At the same time, it is helping reverse a troubling trend: the shortage of qualified science and technology students with the skills to lead the future of the IT industry.
