Gartner Identifies Top 10 Strategic Technologies for 2008

By Nicole Hemsoth

October 15, 2007

Gartner Inc. analysts have highlighted the top 10 technologies and trends that will be strategic for most organizations in 2008. The analysts presented their findings during Gartner Symposium/ITxpo.

Gartner defines a strategic technology as one with the potential for significant impact on the enterprise in the next three years. Factors that denote significant impact include a high potential for disruption to IT or the business, the need for a major dollar investment, or the risk of being late to adopt.

Companies should factor these technologies into their strategic planning process by asking key questions and making deliberate decisions about them during the next two years, said David Cearley, vice president and distinguished analyst at Gartner. Sometimes the decision will be to do nothing with a particular technology. In other cases it will be to continue investing in the technology at the current rate. In still other cases, the decision may be to test/pilot or more aggressively adopt/deploy the technology. The important thing is to ask the question and proactively plan.

The top 10 strategic technologies for 2008 include:

  • Green IT — The focus on Green IT that came to the forefront in 2007 will accelerate and expand in 2008. Consider potential regulations and have alternative plans for data center and capacity growth. Regulations are multiplying and have the potential to seriously constrain companies building data centers, as power-grid demands, carbon emissions and other environmental impacts come under scrutiny. Some companies are emphasizing socially responsible behavior, which might result in vendor preferences and policies that affect IT decisions. Scheduling decisions for workloads on servers will begin to treat power efficiency as a key placement attribute (a minimal placement sketch follows this list).
  • Unified communications — Today, 20 percent of the installed PBX base has migrated to IP telephony, but more than 80 percent of organizations are already running trials of some form. Gartner analysts expect the majority of companies to make the move within the next three years, the first major change in voice communications since the arrival of the digital PBX and the cellular phone in the 1970s and 1980s.
  • Business process modeling — Top-level process services must be defined jointly by a set of roles (including enterprise architects, senior developers, process architects and/or process analysts). Some of those roles sit in a service-oriented architecture (SOA) center of excellence, some in a process center of excellence and some in both. The strategic imperative for 2008 is to bring these groups together. Gartner expects BPM suites to fill a critical role as a complement to SOA development.
  • Metadata management — Through 2010, organizations implementing both customer data integration and product information management will link these master data management initiatives as part of an overall enterprise information management (EIM) strategy. Metadata management is a critical part of a company's information infrastructure. It enables optimization, abstraction and semantic reconciliation of metadata to support reuse, consistency, integrity and shareability. Metadata management also extends into SOA projects with service registries and application development repositories, and metadata plays a role in operations management through configuration management database (CMDB) initiatives.
  • Virtualization 2.0 — Virtualization technologies can improve IT resource utilization and increase the flexibility needed to adapt to changing requirements and workloads. By themselves, however, virtualization technologies are simply enablers of broader improvements in infrastructure cost, flexibility and resiliency. With the addition of service-level, policy-based automation and active management, resource efficiency can improve dramatically, flexibility can adjust automatically to changing requirements, and services can be managed holistically to ensure high levels of resiliency. Virtualization plus service-level, policy-based automation constitutes a real-time infrastructure (RTI); a toy policy loop illustrating the idea follows this list.
  • Mashup and composite apps — By 2010, Web mashups will be the dominant model (80 percent) for the creation of composite enterprise applications. Mashup technologies will evolve significantly over the next five years, and application leaders must take this evolution into account when evaluating the impact of mashups and in formulating an enterprise mashup strategy.
  • Web platform and WOA (Web-oriented architecture) — Software as a service (SaaS) is becoming a viable option in more markets, and companies must evaluate where service-based delivery may provide value in 2008-2010. Meanwhile, Web platforms are emerging that provide service-based access to infrastructure services, information, applications and business processes through Web-based cloud computing environments. Companies must also look beyond SaaS to examine how Web platforms will affect their business in three to five years.
  • Computing fabric — A computing fabric is the evolution of server design beyond today's interim stage, the blade server. The next step in this progression is technology that allows several blades to be merged operationally over the fabric, operating as a single, larger system image that is the sum of the components from those blades. The fabric-based server of the future will treat memory, processors and I/O cards as components in a pool, combining and recombining them into particular arrangements to suit the owner's needs. For example, a large server can be created by combining 32 processors and a number of memory modules from the pool, operating together over the fabric to appear to an operating system as a single fixed server (a toy pooling sketch follows this list).
  • Real world Web — The term real world Web is informal, referring to places where information from the Web is applied to a particular location, activity or context in the real world. It is intended to augment the reality a user faces, not to replace it as virtual worlds do. It is used in real time based on the real-world situation, not prepared in advance for consumption at specific times or researched after the events have occurred. In navigation, for example, a printed list of directions from the Web does not react to changes, whereas a GPS navigation unit provides real-time directions that react to events and movements; the latter is akin to the real-world Web of augmented reality. Now is the time to seek out the new applications, new revenue streams and improvements to business processes that can come from augmenting the world at the right time, place or situation.
  • Social software — Through 2010, the enterprise Web 2.0 product environment will experience considerable flux, with continued product innovation and new entrants, including start-ups, large vendors and traditional collaboration vendors. Expect significant consolidation as competitors strive to deliver robust Web 2.0 offerings to the enterprise. Nevertheless, social software technologies will increasingly be brought into the enterprise to augment traditional collaboration.
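To make the power-aware placement point in the Green IT item concrete, the following sketch shows a greedy scheduler that places each workload on the host expected to add the fewest watts. It is illustrative only; the host names, the linear power model and the wattage figures are assumptions, not part of Gartner's analysis.

```python
# Illustrative only: a greedy power-aware placement heuristic.
# Hosts, their idle/peak wattage and the linear power model are
# hypothetical assumptions, not part of Gartner's analysis.

from dataclasses import dataclass

@dataclass
class Host:
    name: str
    idle_watts: float      # power drawn with no load
    peak_watts: float      # power drawn at 100% utilization
    capacity: float = 1.0  # normalized CPU capacity
    load: float = 0.0      # load currently placed (0..capacity)

    def power_at(self, load: float) -> float:
        """Linear power model between idle and peak draw."""
        util = min(load / self.capacity, 1.0)
        return self.idle_watts + util * (self.peak_watts - self.idle_watts)

    def marginal_power(self, extra: float) -> float:
        """Extra watts needed for this host to absorb `extra` load."""
        return self.power_at(self.load + extra) - self.power_at(self.load)

def place(workload: float, hosts: list[Host]) -> Host:
    """Place a workload on the host that adds the fewest watts."""
    candidates = [h for h in hosts if h.capacity - h.load >= workload]
    best = min(candidates, key=lambda h: h.marginal_power(workload))
    best.load += workload
    return best

if __name__ == "__main__":
    farm = [Host("a", idle_watts=150, peak_watts=300),
            Host("b", idle_watts=90, peak_watts=320)]
    chosen = place(0.4, farm)
    print(f"placed on {chosen.name}, now drawing {chosen.power_at(chosen.load):.0f} W")
```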
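The service-level, policy-based automation behind Virtualization 2.0 can be pictured as a control loop that compares measured behavior against an objective and adjusts capacity accordingly. The sketch below is a toy illustration under assumed names and thresholds; a real RTI product would drive a hypervisor or orchestration API rather than plain Python objects.

```python
# Illustrative only: a toy service-level, policy-based automation loop.
# The Service class, thresholds and scale actions are hypothetical stand-ins
# for the telemetry and provisioning calls a real RTI product would make.

from dataclasses import dataclass
import random

@dataclass
class Service:
    name: str
    vm_count: int
    target_latency_ms: float  # the service-level objective

    def measured_latency_ms(self) -> float:
        # Stand-in for real telemetry: latency falls as VMs are added.
        return random.uniform(80, 160) / self.vm_count

def enforce_policy(service: Service, min_vms: int = 1, max_vms: int = 8) -> str:
    """Scale the service up or down so it tracks its latency objective."""
    latency = service.measured_latency_ms()
    if latency > service.target_latency_ms and service.vm_count < max_vms:
        service.vm_count += 1
        return f"{service.name}: {latency:.0f} ms over SLO, scaled up to {service.vm_count} VMs"
    if latency < 0.5 * service.target_latency_ms and service.vm_count > min_vms:
        service.vm_count -= 1
        return f"{service.name}: {latency:.0f} ms well under SLO, scaled down to {service.vm_count} VMs"
    return f"{service.name}: {latency:.0f} ms within policy, no action"

if __name__ == "__main__":
    svc = Service("orders", vm_count=2, target_latency_ms=50)
    for _ in range(3):
        print(enforce_policy(svc))
```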
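The computing fabric item describes treating processors, memory and I/O cards as a pool that can be recombined into logical servers. The toy model below illustrates only that bookkeeping; the FabricPool class and its methods are hypothetical, since real fabric-based servers perform this composition in hardware and firmware.

```python
# Illustrative only: a toy model of fabric-style resource pooling.
# FabricPool and compose_server are hypothetical names; real fabrics
# compose servers in hardware/firmware, not in application code.

class FabricPool:
    def __init__(self, processors: int, memory_modules: int, io_cards: int):
        self.free = {"processors": processors,
                     "memory_modules": memory_modules,
                     "io_cards": io_cards}

    def compose_server(self, processors: int, memory_modules: int, io_cards: int) -> dict:
        """Carve a logical single-image server out of the pooled components."""
        request = {"processors": processors,
                   "memory_modules": memory_modules,
                   "io_cards": io_cards}
        for part, count in request.items():
            if self.free[part] < count:
                raise RuntimeError(f"not enough {part} in the fabric pool")
        for part, count in request.items():
            self.free[part] -= count
        return request  # to an operating system this would appear as one fixed server

    def release(self, server: dict) -> None:
        """Return a decomposed server's components to the pool."""
        for part, count in server.items():
            self.free[part] += count

if __name__ == "__main__":
    pool = FabricPool(processors=64, memory_modules=128, io_cards=16)
    big_box = pool.compose_server(processors=32, memory_modules=48, io_cards=4)
    print("composed:", big_box, "remaining:", pool.free)
    pool.release(big_box)
```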

These 10 opportunities should be considered in conjunction with many proven, fully matured technologies, as well as others that did not make this list but can provide value for many companies, said Carl Claunch, vice president and distinguished analyst at Gartner. For example, real-time enterprises providing advanced devices for a mobile workforce will consider next-generation smartphones to be a key technology, in addition to the value that this list might offer.

About Gartner

Gartner Inc. is the world's leading information technology research and advisory company. Gartner delivers the technology-related insight necessary for our clients to make the right decisions, every day. From CIOs and senior IT leaders in corporations and government agencies, to business leaders in high-tech and telecom enterprises and professional services firms, to technology investors, Gartner is the indispensable partner to 60,000 clients in 10,000 distinct organizations. Through the resources of Gartner Research, Gartner Consulting and Gartner Events, Gartner works with every client to research, analyze and interpret the business of IT within the context of their individual role. Founded in 1979, Gartner is headquartered in Stamford, Conn., and has 3,900 associates, including 1,200 research analysts and consultants in 75 countries. For more information, visit www.gartner.com.
