When Social Networks Become Social Clouds
Following the first hints of news about the tragedy in Japan, people around the world turned to the Internet to find information in many formats: not just mass media coverage, but also firsthand impressions left on personal websites, blogs and social media outlets. During the Japanese disaster, a combination of social networks and the principles of cloud computing became the primary means of gathering and sharing information.
In recent years, the number of individuals investing a great deal of their time in social networks has increased. The statistics speak volumes: Facebook weighed in at 600 million active users in January 2011, Twitter tallied 190 million users sending 65 million tweets a day as of July 2010, and LinkedIn reported 90 million users in January 2011.
Depending on what the user wants to obtain, socially speaking, the chosen provider will differ. For example, LinkedIn would be selected for maintaining professional records of contacts the user knows personally and for sharing the user's latest work achievements; Twitter would be a means of instant information exchange with peers the user need not know personally; and Facebook would be a great way to reconnect with school friends and share persistent information such as vacation pictures.
In fact, this last Facebook example is the one I use when giving general talks on cloud computing. I ask the audience the following question: “Do you save locally the pictures in which you have been tagged?” The answer is always negative, as everybody understands that the pictures are already there in the social network, always available when needed, with no need to care about what lies underneath. Does this last statement ring a bell?
So basically, more people than expected are using cloud computing without even noticing it.
With the advent of information technology, we have been able to use the Internet to get the latest news on the Japanese disaster. But more importantly, many of us who knew somebody living in the land of the rising sun needed a way to contact them. Thus, news of the person finder service provided by Google spread instantly across Twitter. Once a user learned that this service existed, a new tweet made its way to her or his contacts, making this important information available to almost every user of the network. The important fact here is that this valuable information arrived no matter who stood between the source and me. I also find it interesting that a cloud computing service was being announced through a social cloud, resulting in a perfect integration between clouds.
The importance of social clouds has been recognized by the mass media, with the result that all the big players maintain Twitter accounts for their last-second announcements. During the Japanese disaster, companies and agencies such as Reuters, the BBC (which operates three different accounts depending on the type of information) and Al Jazeera saw an increase in their follower numbers. At the other end, users assemble their own unique set of information providers, which keep their timelines updated with news, no matter what the source.
But we cannot underestimate the power of individuals. Many users came to social clouds like Twitter to gather firsthand impressions instantly and to provide feedback in the form of vital information. This is the case of one of my Japanese contacts, a cloud computing expert, who wrote a brilliant blog post entitled “What should be tweeted in the disaster,” which contained basic guidelines for coping with the situation (saving Internet resources, retweeting government announcements, …) and served as an important inspiration for this article.
As with public clouds, users during the Japanese disaster chose among different providers depending on their needs. The Twitter example for instant information has been explained above. However, many users sought tools for interacting in a more persistent way. Here, the provider would be Facebook, by means of user groups or even the fundraising project within the “Causes” application operated by the American National Red Cross.
In the end, identifying social networks as social clouds is simply an exercise in comparing philosophies, given that the examples shown in this article match the definition of a cloud. These social clouds offer “Information as a Service,” allowing users to dynamically choose sources and to correlate and expand the content at will. Users do not need to understand what or who is bringing them the desired information or providing the tools for exchanging it; the social cloud provides a unique interface to its services.
Finally, I would like to end this article by expressing my condolences to all those who lost someone in Japan, and by offering my support to the whole country in these days of sorrow. And if you are considering donating money to a relief fund, I suggest you read this important article at CNNMoney.
About the Author
Dr. Jose Luis Vazquez-Poletti is an Assistant Professor in Computer Architecture at Complutense University of Madrid (Spain), and a Cloud Computing Researcher at the Distributed Systems Architecture Research Group (http://dsa-research.org/). He is directly involved in EU-funded projects such as EGEE (Grid Computing) and 4CaaSt (PaaS Cloud), as well as many Spanish national initiatives.
From 2005 to 2009 his research focused on porting applications to Grid Computing infrastructures, an activity that put him “where the real action was”. These applications came from a wide range of areas, from Fusion Physics to Bioinformatics. During this period he acquired the skills needed to profile applications and make them benefit from distributed computing infrastructures. He also shared these skills in many training events organized within the EGEE Project and similar initiatives.
Since 2010 his research interests have centered on different aspects of Cloud Computing, always with real-life applications in mind, especially those in the High Performance Computing domain.