Tag: public cloud
Many IT organizations are seeking a new approach to the data management challenges presented by using multiple clouds. In particular, they want an approach that lets them leverage public cloud economics while maintaining control of their data. NetApp offers a new approach to hybrid IT using dedicated, secure, private storage with low-latency access.
Distributed computing has undergone many permutations, from its roots in grid computing supporting large scientific endeavors, to Sun-style utility computing, to the kind of public cloud computing popularized by Amazon Web Services. One of the newest projects of this type combines cycle sharing (think SETI@home) with cryptocurrency principles. Zennet, as it’s called, is being presented …
The private industry least likely to adopt public cloud services for data storage is financial services. Holding the most sensitive and heavily regulated type of data, personal financial information, banks and similar institutions are mostly moving toward private cloud services – and doing so at great cost.
After a lengthy incubation phase, Microsoft is finally ready to release its IaaS product into the wild. AWS, look out.
<img src="http://media2.hpcwire.com/hpcwire/test_tube_image_200x.jpg" alt="" width="93" height="61" />The top research stories of the week include the 2012 Turing Award winners; an examination of MIC acceleration in short-range molecular dynamics simulations; a new computer model to help predict the best HIV treatment; the role of atmospheric clouds in climate change models; and more reliable HPC cloud computing.
<img src="http://media2.hpcwire.com/hpccloud/Yottamine_predictive_analytics_graphic_150x.jpg" alt="" width="96" height="64" />Startup Yottamine Analytics is riding the twin waves of cloud and big data. Its cloud-based predictive modeling solution combines the benefits of EC2 spin-up automation and large-scale program parallelism to provide predictive power by the hour for pennies a minute.
<img style="float: left;" src="http://media2.hpcwire.com/hpccloud/AWS_reInvent_logo_black_2012_150x.jpg" alt="" width="95" height="39" />AWS used its first-ever customer and partner conference, AWS re:Invent, held last week in Las Vegas, as a launch pad for some major company news. During their respective keynotes, AWS Senior Vice President Andrew Jassy revealed a brand-new data warehouse service, AWS Redshift, and another price cut for the S3 storage service, while Amazon.com CTO Werner Vogels announced two super-sized EC2 instance types and another new service, the AWS Data Pipeline.
In a push to compete with rival cloud players, Google announces 36 new instance types and reduced compute and storage costs.
<img style="float: left;" src="http://media2.hpcwire.com/hpccloud/eagle_in_the_clouds.jpg" alt="" width="92" height="75" />Federal IT departments are faced with some tough challenges these days. Not only are budgets constrained, but mandates are starting to stack up like the tax code. One of the most talked about is the cloud-first mandate, but what kind of cloud will it be?
Between the ubiquity of Internet-connected devices and businesses looking to expand their reach through digital means, it’s becoming nearly impossible to talk about technology without mentioning “the cloud.” In the last week, two big-name analysts have released reports, each predicting a public cloud services market that reaches north of $100 billion by the next presidential election.