Tuesday, December 7, 2010

To launch a game changer

Thomas Friedman, in his book "The World Is Flat," describes the ten forces that flattened the world. Among them he lists the events that changed everything: the fall of the Berlin Wall, the rise of the PC, Netscape...

Netscape had one of the most amazing deployment strategies in the industry. Netscape Navigator was based on the Mosaic web browser, co-written by Marc Andreessen, who went on to found Netscape together with Jim Clark. Clark believed that the Mosaic browser had great commercial possibilities and provided the seed money.
On October 13, 1994, Netscape announced that it would make Navigator available without charge to all non-commercial users, in keeping with the notion that Internet software should be distributed for free. That initial corporate policy is interesting in light of what followed.

However, two months later, Netscape apparently reversed that policy: the terms for version 1.0 mentioned only that educational and non-profit institutions could use it at no charge.

The software was available for free download, with boxed versions sold in stores on floppy disks (and later CDs) along with a period of phone support. Email support was initially free, and remained so for a year or two, until the volume of support requests grew too high.

When the consumer Internet revolution arrived in the mid-to-late 1990s, Netscape was well positioned to take advantage of it. With a good mix of features and an attractive licensing scheme that allowed free use for non-commercial purposes, the Netscape browser soon became the de facto standard, particularly on the Windows platform. The media hype definitely helped to establish its leading position. The browser became even more renowned for its on-the-fly display of web pages, where text and graphics appeared on the screen as the page downloaded. During the 1990s, important new features included cookies, frames, and JavaScript (in version 2.0).

According to Wikipedia, the browser remained the market leader with more than a 50% usage share. Industry observers confidently forecast the dawn of a new era of connected computing. The underlying operating system, it was believed, would become an unimportant consideration; future applications would run within a web browser. Netscape saw this as a clear opportunity to entrench Navigator at the heart of the next generation of computing, and thus gain the chance to expand into all manner of other software and service markets.

During that time Microsoft entered the market, using code licensed from Spyglass to create Internet Explorer. Microsoft's browser was thought by many to be inferior and primitive compared to contemporary versions of Netscape Navigator. With the release of IE 3.0 in 1996, Microsoft caught up competitively. Meanwhile, version 4.0 of Netscape was buggy and crash-prone, hastening the browser's decline. Other important factors were the emergence of open-source browsers and Microsoft's deal with Apple, under which Internet Explorer became the default browser in Mac OS.

In March 1998, Netscape released most of the code base for Netscape Communicator under an open source license. The open-source product, planned as Netscape 5, drew on community contributions and was known as Mozilla, Netscape Navigator's original code name.

Resources:
Friedman, Thomas. The World Is Flat: A Brief History of the Twenty-First Century. 2007.
Netscape Navigator. (2010, December 5). In Wikipedia, The Free Encyclopedia. Retrieved 04:45, December 8, 2010, from http://en.wikipedia.org/w/index.php?title=Netscape_Navigator&oldid=400669043

Friday, November 26, 2010

Virtual teams and trust

The advent of the Internet has provided new opportunities for collaboration thought impossible just a few years ago. The term 'virtual organization' (VO) has been coined to describe the way technologies enable collaborators to work together in ways that parallel membership of a common institute. But the concept is actually more flexible. It might, for example, represent a formal entity associated with sharing of resources, with quality-of-service agreements and access control policies; more generally, a virtual organization is an expression of a task-oriented collaboration between members of geographically distinct institutes ("New tools to support collaboration and virtual organizations").

Due to the ever-increasing trend towards globalization, virtual teams are becoming essential to enhancing a company's competitive advantage. With the shift towards more decentralized organizational structures, companies are beginning to tap the vast pool of individuals with high specialization and experience. This not only helps companies attain organizational goals and expand their financial success, but also creates more opportunities for employees to advance their careers and transform the future of any type of business.

One disadvantage of a virtual team stems from the improper use of communication channels and mediums (Piccoli, 2004). When using e-mail, chat, and other similar technologies, the richness of communication suffers because nonverbal communication is lost, making the development of a team much more difficult. This problem is magnified when dealing with individuals from multiple cultural backgrounds, which can create communication barriers and fault lines that impede the development of interpersonal relationships. Another disadvantage of virtual teams is their reliance on technology. Any type of malfunction with the technology being used will inhibit the team’s ability to interact, making it almost impossible to complete any task at that time. Additionally, communication lags are inevitable and unavoidable, further preventing the team from performing efficiently because information sharing becomes tedious (Piccoli).

One of the most important things with a VO or VT is selecting the right tools and technology: find tools that foster communication and trust. It is important to ensure all team members are on the same page about what is happening in the organization as well as on their dedicated projects. Social networking tools make it easy to keep the lines of communication open at all times, building deeper relationships and lessening isolation among geographically dispersed peers. The newness and social aspects of online tools can be exciting; however, it is important to set standards and protocols for how and when your team will use them.

When selecting tools, make sure they meet the needs of the diverse individuals who will use them. Do they help with the typical challenges virtual team members face, such as adjusting to different time zones, cultures, and languages? For companies to grow, they must now rely on Internet collaboration and emphasize individual contributions. It is important for team members to control the brainstorming process, share research, and generate new ideas online. Social networking tools can help capture personal observations, invite comments from other team members, and distribute new knowledge.

Many online teams communicate through a variety of products such as Dimdim Online Collaboration, GoTo Meeting, Huddle, Vyew Instant Workspace, Nomadesk, and WebEx. Although all of these provide ways to communicate, they vary in popularity and use, privacy and security, cost, technology, and training. Social networking sites also integrate many tools that are useful for collaboration; unfortunately, it is not practical to use many of the popular social networks, such as Facebook or even Nature Network (a social networking site for scientists), for scientific collaboration.

One of the most important factors for virtual teams is TRUST. According to Fukuyama, the culture of trust, as the source of spontaneous sociability, allows enterprises to grow beyond the family into professionally managed organizations. Trust is the elixir of group life: the belief, or confidence, in a person's or organization's integrity, fairness, and reliability. This faith comes from experience, however brief or extensive. The importance of trust cuts across a team's life cycle. As trust accumulates, in teams, corporations, communities, and nations, it creates a new form of wealth. In the Network Age, human, social, and knowledge capital are as potent a source of value as land, resources, skills, and technology. Trust will be one of the main ingredients of future VTs and of the success of innovative projects run by dispersed group members.

One example could be Valent Software. The CEO lived in Massachusetts, the president worked from Utah, the engineering team was based in Ohio, and a few others worked out of their homes. Yet, while Valent Software’s ten employees never really co-located, they were able to sell their $700,000 investment and three years of work for $45 million to a major web portal.

Nowadays it doesn't really matter to tech startups where their employees are geographically located. Just check this article.

Tuesday, November 16, 2010

Business Strategy Innovation Diamond

Product innovation is the engine that drives growth and prosperity for many companies. At the same time, it is one of the most difficult undertakings of the modern corporation. Why are some companies successful at developing and launching new products even as product life cycles get shorter, while others have had more than their fair share of failures and gone bankrupt? Professor R. G. Cooper and his partners conducted research covering around 2,000 projects and 500 companies. This research addressed two main questions:
  1. How successful and profitable are the companies' new product efforts?
  2. Are certain product development practices connected to success and profitability?
Best practice research has uncovered a common theme in organizations that excel at product innovation. Four key areas of best practice stand out as common denominators: a product innovation and technology strategy; portfolio management, both strategic and tactical; an idea-to-launch new product development process; and the right climate and culture for innovation. These four performance drivers comprise the Innovation Diamond.

The best-performing companies (the top 25%) turn 78% of their new products into money-makers. The best companies also execute their projects much faster and more efficiently. They have a much higher proportion of projects completed on time.

Measuring product development performance and practices is one of the most important best practices. Unfortunately, it is also one of the weakest areas according to Cooper's research, because most companies don't measure and don't know how well (or poorly) they are doing. Almost a third of companies do not systematically measure their product success rate or their adherence to budgets and schedules, either for individual projects or for the company as a whole. Post-launch reviews are not done, or are done poorly, in most businesses. New product development project teams are also typically not held accountable for the business results of their projects.

The research also found that many senior management teams do not keep score in NPD (New Product Development): overall NPD results are not measured, and new product results are typically not part of senior management's personal objectives or reward systems. Furthermore, most companies do not measure how well their product development process is working.

The Business Strategy Innovation Diamond (BSID) focuses your organization on making sure that the policies support the strategy, that the processes facilitate the policies, that the systems enable the processes, and that the reporting measures the execution of the strategy. Not focusing on the BSID may result in just BS instead of strategic innovation.

Friday, November 12, 2010

Organizing for innovation with open innovation

There are different opinions concerning the relationship between an organization's structure and its likelihood of innovation, its level of creativity, and its willingness to experiment. Some researchers hold that the bigger the firm, the better the chances for innovation, because large firms can accumulate more money for R&D; size also helps with developing competences and offers a wide range of innovation projects to choose from. At the same time, big firms have bigger bureaucracies, which hinder flexibility and entrepreneurship.

Centralization was considered the best managerial approach a long time ago, and some organizations are still happy with that kind of setup. Recently, though, most organizations have been transforming from a centralized structure through a multi-divisional setup toward a loosely coupled network form (Scott, 2003). The changes brought by the knowledge economy and information systems are reshaping organizations and their views on innovation. Terms such as virtual organizations, network organizations, and modular organizations are used more and more.
And what does it take to organize for innovation?
  • imagination
  • thinking outside the box
  • willingness to take significant risk
  • acceptance of failure
  • openness to the new and untried
  • slack resources to generate and develop ideas
John Roberts, professor of economics, strategic management, and international business at Stanford Graduate School of Business, says: "... firms must develop multiple business opportunities, and to continue to grow and survive they must do this on an ongoing basis".
But there is something new that more and more leading companies embrace: open innovation. Henry Chesbrough says that “Open innovation is a paradigm that assumes that firms can and should use external ideas as well as internal ideas, and internal and external paths to market, as the firms look to advance their technology.”
There are many examples from a variety of industries, including:
TELECOM: BT (formerly British Telecom) – link
CONSUMER ELECTRONICS: Philips – link
CONSUMER PRODUCTS: P&G – link
PHARMACEUTICALS: Novartis – link
CHEMICALS: Air Products – link
IT HARDWARE: Sun Microsystems – link
FOOD AND BEVERAGE: Starbucks – link
COMPUTERS: DELL – link
US GOVERNMENT – link
In an interesting article on how Procter & Gamble is using “open innovation” to develop new products and drive growth, Stefan Lindegaard gets into specifics on how the giant soap company puts ideas into action. This is good, because a lot of the talk about innovation can be overly abstract.
- They seek out ideas in 85 different networks and over 120 universities, and 75 percent of the searches result in “viable leads”
- They have a website in five languages to encourage unsolicited submissions
- More than half of their innovation is sourced externally
Elsewhere, Lindegaard writes of how P&G sees the future of open innovation. They also see the wisdom of crowds playing a part where consumers and communities will be tapped for ideas. He says: "There will be a tremendous amount of innovation in and from developing regions which is driven by population as well as capability growth in countries such as India, China and Brazil."

Tuesday, November 2, 2010

Copyright is failing in the 21st century

Several years ago the EU Commission decided to reconsider copyright law in the information society. The Commission even talked about copyright in the knowledge economy. One thing remains clear: copyright law is out of touch with the changes produced by technological innovation.

The origin of copyright law in most European countries lies in efforts by the church and governments to regulate and control printing, which was widely established in the 15th and 16th centuries. Before the invention of the printing press, a writing, once created, could only be physically multiplied by the highly laborious and error-prone process of manual copying by scribes. Printing allowed for multiple exact copies of a work, leading to a more rapid and widespread circulation of ideas and information.

Later on, governments gave licenses to printers and publishers to print certain materials (for example, the Stationers' Company in Britain). In pre-revolutionary France all books had to be approved by official censors, and authors and publishers had to obtain a royal privilege before a book could be published. Royal privileges were exclusive and usually granted for six years, with the possibility of renewal. Over time it was established that the owner of a royal privilege had the sole right to obtain a renewal indefinitely. In 1761 the Royal Council awarded a royal privilege to the heirs of an author rather than the author's publisher, sparking a national debate on the nature of literary property similar to the one taking place in Britain during the battle of the booksellers (information from Wikipedia).

Because of the incentives given by early copyright law, printers and publishers were essentially patrons of art, spurring the development of literature and creative works.
Now, fast forward to the end of the 20th century, when copyright law held that a work remains protected for 50 to 70 years after the author's death. If in the beginning copyright played an essential role in funding art and creativity, it now stops people from using works that should be in the public domain and from spurring new creativity out of existing works.

Take DRM (digital rights management): it tried to apply the old approach to copyright protection in the new digital era and failed miserably. It was hacked so quickly that it is an old joke by now. I am sure that every time a company decides to hinder creativity by closing a platform or work, there will be people who hack it, making innovation possible again. Because, you see, innovation is putting old materials and knowledge together in new and creative ways. Nothing comes from thin air, and if the previous works are locked under copyright, invention is decreased or boxed into the way companies want their products used. If copyright law covers such large chunks of information for such a long time, it will hinder the ability of future generations to rip, mix, and burn (as Lessig puts it). A generative system is what we need, and for that we need a completely new law. Take Creative Commons, for example: many were skeptical it would work, because people would never choose to pay for something offered for free. But this has proven not to be true.

Besides traditional copyright, you can choose between:

1. Creative Commons: a set of copyright licenses that allow the distribution of copyrighted works. The licenses differ in the combinations of conditions that govern the terms of distribution.
2. Copyleft: a play on the word copyright, describing the practice of using copyright law to offer the right to distribute copies and modified versions of a work while requiring that the same rights be preserved in modified versions. In other words, copyleft is a general method for making a program (or other work) free and requiring all modified and extended versions of the program to be free as well.
3. GNU General Public License: a free, copyleft license for software and other kinds of works (see the example below).
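
To see how copyleft works in practice, here is the standard notice the GNU project recommends placing at the top of each source file (a minimal sketch; the program name, year, and author are placeholders):

    # frobnicate.py - a hypothetical program, used only to illustrate the notice.
    # Copyright (C) 2010 Jane Doe
    #
    # This program is free software: you can redistribute it and/or modify
    # it under the terms of the GNU General Public License as published by
    # the Free Software Foundation, either version 3 of the License, or
    # (at your option) any later version.
    #
    # This program is distributed in the hope that it will be useful,
    # but WITHOUT ANY WARRANTY; without even the implied warranty of
    # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
    # GNU General Public License for more details.
    #
    # You should have received a copy of the GNU General Public License
    # along with this program. If not, see <http://www.gnu.org/licenses/>.

Anyone who redistributes the file, modified or not, has to keep this notice intact, and that requirement is what keeps derivative versions free.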


Sunday, October 24, 2010

Acadia - Virtual Computing Environment (VCE) coalition


Acadia is a joint venture between Cisco and EMC, with investments from Intel and VMware. It is aimed at next-generation data centers designed to deliver private cloud infrastructure. By joining forces, EMC and Cisco can expand into the server market quickly, offering integrated systems built with technology from both.

The integrated systems, called VBlock Infrastructure Packages, will allow customers to buy all the equipment and software they need together, from one seller. EMC and Cisco will offer systems that would allow customers to install new software or manage company information from a single control center. The new venture will focus on designing and building systems that rely on virtualization technology, which can help customers create a more flexible technology infrastructure and lower their capital spending costs.

The alliance could be a major boost to Cisco’s effort to expand beyond the networking equipment business. In 2009 Cisco launched the Unified Computing System, a new line of server computers. This put Cisco in competition against industry titans IBM and HP, which sell integrated computers and storage products. EMC and VMware are dominant players in storage and server virtualization, technology that allows many computers to run together as efficiently as one machine, but can’t match IBM and HP in servers or Cisco in networking.

“They’ve taken server, networking, and storage and put them together into a single unit,” said Mark Bowker, senior analyst at Enterprise Strategy Group in Milford.

A big issue here is the openness of the platform. None of these players is known for its embrace of open software, and most are far more famous for squeezing high margins out of proprietary code. IBM and Rackspace have been pushing for some type of open cloud effort, which they define as one built through a standards group. Vblocks are a refutation of that model, and of the idea that commodity hardware will underlie most clouds.


Friday, October 15, 2010

Open source as a fundamental business model

Red Hat, Inc. was founded in 1995 by entrepreneurs Robert Young and Marc Ewing, and is a market leader in open source software systems for mainframes, servers, workstations, and embedded devices. The company's core product is the Red Hat Linux operating system, which is the leading Linux system for servers. More recently the company has expanded its product line to offer open source solutions in such areas as e-commerce, embedded devices, and database solutions.

The company's mission was to market and develop its own version of the Linux open source operating system to end users. While Red Hat made its version available for free downloading, it also sold CD-ROM versions. When it released Red Hat Linux 5.0 in November 1997 for $49.95, InfoWorld called it "a complete Internet server in a box." The software package included everything a system administrator needed to get an Internet or corporate intranet server running in one day, including the Apache Web server, a mail server, a news server, a domain name server, a gopher server, and more. It also included development tools and a freeware database engine. 

Business model:

Unlike vendors of proprietary systems that carefully guard their source code, Red Hat employs an open source software model. It opens up its software code to innovation from an international community of contributors while licensing and selling software that has been tested for reliability and interoperability with a variety of popular applications. Leveraging the volunteer community of users working to improve the code, Red Hat sells its software at prices that often undercut its competitors'.

Red Hat sells subscriptions for the support, training, and integration services that help customers use open-source software. Customers pay one set price for unlimited access to services such as Red Hat Network and up to 24/7 support.

Revenue comes from two sources: software subscriptions, and training and services. The company makes about 85% of its revenue from software subscriptions, which include setup, assurance of compatibility with existing software, and other services involving upgrades and troubleshooting. In fact, Red Hat derives its revenue from these services rather than from the software alone. The remaining 15% comes from services that include advanced technical support, hourly consulting, engineering, and customer training and education.

Red Hat also pursued a strategy of building tight partnerships with industry-leading technology companies. The company's first round of strategic financing was completed in September 1998, when it received financial backing from Intel and Netscape Communications as well as from venture capital firms Benchmark Capital and Greylock Management. It completed a second round of financing in March 1999, when computing industry leaders Compaq, IBM, Novell, Oracle, and SAP took minority equity positions in the company, indicating their commitment to the development and adoption of Linux operating systems.

It entered into strategic partnerships with IBM and Dell. Under the IBM agreement, developers from IBM and Red Hat would team up to move Linux onto IBM's Netfinity servers, PC 300 commercial desktops, IntelliStations, and ThinkPads.

Red Hat filed for its initial public offering (IPO) in June 1999 and hoped to raise about $96 million. The company had revenue of $10.8 million for its fiscal year ending February 28, 1999, compared to revenue of $5.1 million in fiscal 1998. On August 11, 1999, Red Hat shares began trading at $14 a share. By the end of the day they closed up 227 percent at just over $52 a share.

In November 1999 it acquired Linux pioneer Cygnus Solutions for $674 million in stock. Cygnus's business included producing compilers and debuggers, developing embedded software for handheld devices and other appliances, and application development tools.

In January 2000 Red Hat acquired e-commerce software vendor Hell's Kitchen Systems Inc. for $91 million in stock. HKS's payment processing software was a key component for e-commerce operations. The acquisition helped Red Hat position its Linux offerings as a more viable software solution for businesses that wanted to migrate their operations online. Around the same time Red Hat entered into a strategic partnership with electronic security firm RSA Security Inc. that added encryption capabilities to Red Hat's Linux software.

Red Hat's next major acquisition involved performance management software vendor Bluecurve for $35 million in stock. The acquisition enabled Red Hat to offer a tool for monitoring Linux systems. Bluecurve software allowed developers to simulate transactions and scale their infrastructures to different levels of service.

In September 2000 Red Hat announced the creation of the Red Hat Network, a new Internet-based subscription service for developers and users. Subscribers would be able to access open source advances, upgrades, and security features. Developers could register information on their hardware and software systems, thus facilitating joint projects. The Red Hat Network also included a number of support services from open source experts, tight integration with the Red Hat Package Manager, and customizable update management services. The Red Hat Network also made it easier for Red Hat to deploy upgrades and security patches over the Internet.

Red Hat proposed an open source operating system, eCos, for 2.5G and third-generation wireless devices. Plans called for the new code to be developed by Red Hat in association with 3G Labs. The new operating system would be based on Red Hat's open source embedded real-time operating system, not Linux.
The company continued to expand its product offerings with the introduction of a new open source e-commerce software suite.

Open source represents a fundamental shift in how software is created. The code that makes up the software is available to anyone, and developers who use the software are free to improve it. The result: rapid innovation.

Principal Competitors:

Caldera International, Inc.; Microsoft Corporation; SuSE Linux AG; Turbolinux Inc.; VA Linux Systems.

Utilities and tools:

Over and above Red Hat's major products and acquisitions, Red Hat programmers have produced software programming tools and utilities to supplement standard Unix and Linux software. Some of these Red Hat "products" have found their way from specifically Red Hat operating environments via open-source channels to a wider community. Such utilities include:
  • Disk Druid (for disk partitioning)
  • rpm (for package management; see the sketch after this list)
  • sosreport (gathers system hardware and configuration details)
  • systemtap (tracing tool for Linux kernel, developed in collaboration with IBM, Hitachi, Oracle and Intel)
  • NetworkManager 
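
As a small illustration, rpm ships Python bindings that let you query the package database programmatically. This is a minimal sketch, assuming the rpm Python module that comes with Red Hat systems:

    # List an installed package by querying the local RPM database.
    # Assumes the rpm Python bindings shipped with Red Hat systems.
    import rpm

    ts = rpm.TransactionSet()              # handle to the RPM database
    for h in ts.dbMatch('name', 'bash'):   # iterate matching package headers
        print('%s-%s-%s' % (h[rpm.RPMTAG_NAME],
                            h[rpm.RPMTAG_VERSION],
                            h[rpm.RPMTAG_RELEASE]))

Running it prints something like bash-3.2-32.el5, the same information that "rpm -q bash" gives on the command line.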



Tuesday, October 5, 2010

First movers, early followers, and the future of virtual worlds


In 1992 Neal Stephenson's novel Snow Crash presented us with the Metaverse, where humans, as avatars, interact with each other and with software agents in a three-dimensional space that uses the metaphor of the real world. All this was nothing but science fiction then: neither the technology nor the network was mature enough to bring the vision to life.

In 1999 a hardware company called Linden Lab was founded, geared towards the research and development of haptics (http://en.wikipedia.org/wiki/Haptics). They needed a virtual world to go with their hardware, and so in 2001 they started building what became Linden World and, later, Second Life, the 3D virtual world with user-generated content, where users could interact with each other in real time.

Linden World was spread almost seamlessly across multiple servers (albeit only a couple), and it was envisioned that one day it might become a sprawling and distributed agglomeration of third-party servers. The streaming content architecture and protocols allowed people to create content and to participate in content creation in real time, without drowning their connections in data.

The company was actually a first mover in an industry that would gain momentum and hype throughout the world. But in the beginning nobody wanted to fund it: people saw no advantage in the platform and the idea of a game. In one meeting the vision changed, and the investors' attention shifted from the game to the seamless, real-time, collaborative content-creation platform.

    Soon "The rig" would be renamed to "Second life" and would be released to a limited testing audience in 2002. In 2003 the beta testing version was distributed. People started generating content and trying to evade taxes, but essentially Second Life was a lone player on the market for virtual reality. That also meant financial troubles. Regardless the marketing campaign of the Rosendale, the platform didn't generate a lot of users. In 2004 the complementary software started emerging - Tringo for example is an online multiplayer game, available to play inside the virtual reality platform, was created by Nathan Keir (aka Kermitt Quirk).

And then it happened. Thanks to technological advancements such as QuickTime media streaming, an easier user interface, better connection speeds, a basically empty world waiting to be filled with people's visions, and the perceived chance of becoming a millionaire, Second Life became a mass hit. BusinessWeek wrote about it, firms opened virtual offices there and wrote press releases about them, and media outlets went crazy with the stories.

This in turn drew in regular users, who could download the platform, log in to the virtual universe, and create whatever vision they had in mind. Of course, it was also a gold mine for banks, dubious schemes, casinos, pornography, and so on. Companies such as Adidas, Nike, Cisco, IBM, and HP saw it as a golden PR opportunity. Many universities followed suit and established online classes. The book publishing industry did not hesitate either: a British publishing house held the first virtual book fair in 2007, joined by houses like Random House, Penguin UK, Wiley, and others.

And the bigger the hype, the more users it generated. It also attracted early followers:
Utherverse - the universe or "Metaverse" that engulfs all of the cities that make up its environment; the major competitor of Second Life.
SmallWorlds - its beta launched in late 2008; it needs no download (it runs in your web browser), is highly customizable, and took many users from Second Life because it coped better with the traffic that brands need.

Other competitors are IMVU, Active Worlds, Onverse, Kaneva, and Blue Mars.

Google also decided to create its own virtual world, which generated a lot of talk and interest in the initial stages, but it failed miserably with its web-based virtual environment, Lively. The project lived less than a year before being discontinued on December 31, 2008. Not only did the world fail to generate interest; the timing was probably wrong for entering the segment. Google was at least smart enough to stop the project quickly.

With advancing technology and people's growing interest, it is only a matter of time before virtual life becomes a regular part of life. Though there are many lessons to be learned from the early market entrants (problems with funding, technology, and law), there is also a lot of space left to explore. One of the big pluses is the ROI advantage for business and online classes for academia.

Resources: here and here, virtual news, and Wikipedia

Tuesday, September 28, 2010

The new paradigm and future problems

One phrase is extremely popular lately: "cloud computing". Some people call it a new paradigm, the next big thing, the new wave, or even sexy, but it could actually lead to some problems.


Cloud computing is a general term for anything that involves delivering hosted services over the Internet. These services are broadly divided into three categories: Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS). The name cloud computing was inspired by the cloud symbol that's often used to represent the Internet in flowcharts and diagrams.

A cloud service has three distinct characteristics that differentiate it from traditional hosting. It is sold on demand, typically by the minute or the hour; it is elastic (a user can have as much or as little of a service as they want at any given time); and the service is fully managed by the provider (the consumer needs nothing but a personal computer and Internet access).

Clouds can be public, community, private, or hybrid. You probably think you've never used one, but think again: Zoho, Google Apps, online storage for your files, online backup systems, even Gmail and Hotmail, Skype, Google Voice, Facebook, Twitter, LinkedIn, Picasa, YouTube, Flickr, and torrent sites all run on this paradigm.

Cloud services free businesses and consumers from having to invest in hardware or install software on their devices. They reduce maintenance and hardware upgrade needs; because the solutions are all Web-based, even older computers can be used to access cloud services.

For mobile workers especially, cloud computing provides incredible flexibility: professionals can work from any computing device anywhere, as long as they have access to the Web. It also makes collaboration easier, especially for virtual teams.

And with all the positive sides of the technology come the downsides: you need Internet access; the service should have 99.9% uptime; there are security issues; and there is the question of how reliable your provider is - could it go out of business soon and leave you without your entire collection of pictures, files, work, and music?
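
To put that uptime figure in perspective, a 99.9% guarantee still allows almost nine hours of downtime per year; a quick back-of-the-envelope check:

    # Downtime permitted by a 99.9% uptime guarantee.
    hours_per_year = 365 * 24                      # 8760 hours
    downtime_hours = (1 - 0.999) * hours_per_year
    print(downtime_hours)                          # about 8.76 hours per year

For a business that lives entirely in the cloud, those nine hours can land at the worst possible moment.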

The other major problem is the multitude of different clouds. You can use Google's cloud, Amazon's, Microsoft's, and a thousand more - one for every service, piece of software, or job you need - but there is no interoperability between them.

The world of cloud computing APIs has been constantly evolving since this highly scalable architecture first gained attention less than five years ago, but it is an area with great expectations and, so far, little commanding consensus on architecture. There is a lack of consistency in cloud APIs, and there is no long-term vision about where the paradigm is going.
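
In practice, anyone who wants to stay provider-neutral today has to write an adapter layer of their own. Here is a minimal sketch of the idea in Python; the two "vendor" classes and their method names are hypothetical stand-ins, not any real vendor's API:

    # A tiny provider-neutral facade over two incompatible cloud storage APIs.
    # Both vendor clients below are hypothetical stand-ins, not real SDKs.

    class AmazonStyleClient(object):
        def put_object(self, bucket, key, data):
            print('PUT %s/%s (%d bytes)' % (bucket, key, len(data)))

    class AzureStyleClient(object):
        def upload_blob(self, container, name, data):
            print('UPLOAD %s/%s (%d bytes)' % (container, name, len(data)))

    class CloudStorage(object):
        """The single interface the application codes against."""
        def save(self, location, name, data):
            raise NotImplementedError

    class AmazonAdapter(CloudStorage):
        def __init__(self, client):
            self.client = client
        def save(self, location, name, data):
            self.client.put_object(location, name, data)    # translate the call

    class AzureAdapter(CloudStorage):
        def __init__(self, client):
            self.client = client
        def save(self, location, name, data):
            self.client.upload_blob(location, name, data)   # translate the call

    # The application never mentions a vendor:
    for store in (AmazonAdapter(AmazonStyleClient()),
                  AzureAdapter(AzureStyleClient())):
        store.save('backups', 'photos.zip', 'some bytes')

Every team ends up maintaining glue code like this; a common standard would make it unnecessary, which is exactly what the manifesto debate below was about.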

IBM published the Open Cloud Manifesto, a document that was supposed to lay the tracks for openness and interoperability in cloud computing but which was rejected by the industry's major players: Google, Amazon, Salesforce.com, and Microsoft. The whole paradigm needs not only standards compliance today, but also a unified approach to evolving those standards and a way to deal with future problems.

When Yuval Shavit asked Steven Yi, Microsoft's director of product management for the Azure Services Platform, how he thought the cloud manifesto should have been drafted and what Microsoft would like to see in it, he answered: "I don't necessarily think there needs to be a manifesto as long as we're delivering on customer needs" and interoperating with other services. During their talk, Yi also emphasized that Azure is standards-compliant and that the company is constantly talking to its developers.

Cloud computing is still in its infancy, and that approach is working for now, but it doesn't leave much room for the inevitable differences that will arise between cloud computing platforms.