Source – forbes.com
Moshe Yanai has been re-thinking the box for more than 40 years, inventing new configurations of the large containers in which we keep the data generated by our constantly expanding digital lives. Data is eating the world and Yanai has been staying on top of its exponential growth by applying smart algorithms (what is now popularly called “artificial intelligence”) to cost-effectively store and manage data.
First at Siemens-Nixdorf and then at EMC, he went up against IBM, the dominant data storage player at the time, and won. Then Yanai re-configured storage twice more: with XIV, which he sold to IBM in 2007, and now as founder and CEO of Infinidat, a startup valued at $1.2 billion (as of 2015) that grew 144% last year.
Along the way, he also figured out disruptive innovation. He calls it “alchemy, creating gold out of sand.” Here’s my distillation of Yanai’s 5 ingredients of a successful alchemy process: coming up with the right invention at the right time and transforming it into a winner in the marketplace.
Yanai’s 5 rules for disruptive innovation:
- Study the conventional wisdom and find out exactly what’s wrong with it.
- Focus on how you put your product together, on mixing software and hardware in a way that ensures the product is hardware-agnostic.
- Keep in mind the cost of operating and maintaining the product, including ease-of-use and minimizing human error.
- Listen to what customers want but figure out what they actually need.
- Hire the right mix of people and drive them to success by motivating them with an exciting and unconventional vision.
“The growth of data is non-linear but there are linear means to solve it,” Yanai sums up his career in innovation. One of these means is challenging the conventional wisdom about innovation itself. In the last 20 years, ever since Clayton Christensen popularized the concept of “disruptive innovation,” the conventional wisdom has dictated that to “disrupt” a market, you must use low-end components to create a product used by a low-end market segment ignored by the incumbents, a product that eventually becomes mainstream, a “disruptive innovation” upending the existing order.
Throughout his career, Yanai has done exactly the opposite. Again and again, he has targeted the high-end of the data storage market with low-end components, challenging the conventional wisdom that “if you want to develop a fast car, like a Porsche, you must put expensive parts in it.”
His first data storage product, for mainframe computers, was initially developed at the Israeli firm Elbit (which later transferred it, along with Yanai, to Nixdorf, itself subsequently acquired by Siemens, another German firm). It used mini-computer drives far lower in quality than the reliable, fast, and expensive disk drives in IBM’s dominant data storage solution.
When Yanai joined EMC thirty years ago, IBM was still dominant (80% share of the mainframe storage market) with a reliable, fast, and expensive product. This time he used slow and unreliable PC drives to create Symmetrix, the product that within five years of its introduction took over market share leadership from IBM and then proceeded to dominate the “open systems” (i.e., non-mainframe) data storage market.
Before the rise of EMC to data storage prominence in the 1990s, IBM and its handful of competitors put the computer (later to be known as a “server”) at the center of their product development and sales, relegating the data storage device to the secondary role of a “dumb peripheral.” The result was that faster and faster processors, comprising the central processing unit (CPU), the most important part of the server, were “all waiting at the same speed” for the mechanical disk drive to spin and deliver the data to them. This performance gap between the mainframe processor working at electronic speeds and its 14.5” proprietary disk drive came to be known as “the I/O bottleneck.”
This was the context for Yanai and his team’s multiple challenges to the prevailing conventional wisdom. They used off-the-shelf 5.25” PC disk drives and augmented them with processors dedicated to the storage device and a significant amount of cache (electronic memory). In so doing, they transformed the storage device from a “dumb peripheral” into an intelligent machine. Sophisticated algorithms and powerful microprocessors, sitting between the cache and the CPU on one side and between the cache and the disk drives on the other, worked together as a smart traffic cop, moving data at electronic speed between the storage device and the CPU rather than waiting for the disk drive, a mechanical device, to spin slowly in search of the data the CPU requested.
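To make the “smart traffic cop” concrete, here is a minimal sketch in Python of a storage controller whose RAM cache sits between the CPU and a slow mechanical disk, serving repeat reads at electronic speed and paying the mechanical cost only on a miss. This illustrates the general caching idea, not EMC’s actual Symmetrix design; the class and parameter names are invented for the example:

```python
from collections import OrderedDict

class CachedStorageController:
    """Toy model of an intelligent storage device: an LRU cache of
    disk blocks sits in front of a slow, dict-like backing store."""

    def __init__(self, disk, cache_blocks=1024):
        self.disk = disk                 # slow mechanical backing store
        self.cache = OrderedDict()       # block_id -> data, in LRU order
        self.capacity = cache_blocks
        self.hits = self.misses = 0

    def read(self, block_id):
        if block_id in self.cache:       # hit: served at electronic speed
            self.cache.move_to_end(block_id)
            self.hits += 1
            return self.cache[block_id]
        self.misses += 1                 # miss: wait for the disk to spin
        data = self.disk[block_id]
        self._install(block_id, data)
        return data

    def write(self, block_id, data):
        self._install(block_id, data)    # absorb the write in cache...
        self.disk[block_id] = data       # ...then destage it to disk

    def _install(self, block_id, data):
        self.cache[block_id] = data
        self.cache.move_to_end(block_id)
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)   # evict least recently used

# Usage: the second read of block 42 never touches the disk.
disk = {i: f"block-{i}" for i in range(10_000)}
controller = CachedStorageController(disk, cache_blocks=256)
controller.read(42)
controller.read(42)
assert controller.hits == 1
```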
“But it’s not just the product, you need to have the right timing,” says Yanai. “There’s an inflection point when the solution makes sense to the customer and the market.” Symmetrix, this new intelligent storage device, was introduced in 1990, when the speed of shuttling data back and forth between storage and server became a paramount consideration. Why? “That was the time of the emergence of relational databases,” says Yanai.
The late 1980s and early 1990s witnessed the first big data wave, rising to challenge large corporations and their IT departments. Here’s one example: A favorite slogan of the time in the insurance industry was “one and done.” Increased competition led to an increased focus on customer satisfaction, which in turn made insurance companies want to answer a customer’s phone inquiry without transferring the call to one or more other departments. To achieve “one and done,” call center agents needed all the relevant data (e.g., a specific customer’s home insurance, car insurance, etc.) at their fingertips, which meant larger and larger databases.
At the same time, insurance companies started keeping records of their transactions and interactions with customers for longer periods, because their marketing departments discovered they could mine the data to find new ways to reach customers (and non-customers) and serve them better. What all of this new fascination with data, aided by a new way to organize it (relational databases), meant was that the IT department ran out of time to back up the growing mountains of data. A fast data storage device allowed a backup to happen in two hours rather than eight (typically the absolute limit of the overnight window during which the computer system could be taken down to perform the backup).
Bigger and bigger data required not just speed but also the ability to scale, to match the growth of your storage capacity to the growth of your data. “You needed something that will scale at the right cost and the right reliability,” says Yanai. The low-end hardware components (used in fast-growing, large markets such as the PC market, benefiting from economies of scale) ensured “the right cost,” and the software ensured “the right reliability.” That appealed to business executives finding new business reasons to collect and mine data and to IT executives striving to meet the new challenges (not just backup) of managing it.
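How can software wring “the right reliability” out of low-end hardware? The general mechanism is redundancy computed in software, as in RAID: keep a parity block so the contents of any single failed drive can be rebuilt from the survivors. Here is a minimal Python sketch of that idea, assuming simple XOR parity (an illustration of the general technique, not a description of Symmetrix’s actual protection scheme):

```python
def xor_blocks(blocks):
    """XOR equal-length byte strings together."""
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            out[i] ^= b
    return bytes(out)

# Three data stripes live on three cheap drives; software stores
# their XOR parity on a fourth drive.
data = [b"AAAA", b"BBBB", b"CCCC"]
parity = xor_blocks(data)

# Drive 1 fails: rebuild its stripe from the survivors plus parity.
recovered = xor_blocks([data[0], data[2], parity])
assert recovered == data[1]
```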
But beyond keeping the initial cost to the customer reasonable and competitive, Yanai’s rules of disruptive innovation call for paying special attention to the cost of maintaining and operating the product over its life. In the case of Symmetrix, that was part of the package—the 5.25” PC drives and the other hardware components were housed in a much smaller container than IBM’s with its 14.5” disk drives. That translated into large savings for Symmetrix customers in the form of freed-up data center floor space.
And Yanai’s team went beyond savings on expensive data center real estate, inventing myriad ways to reduce the operating cost of the device. One such invention, a by-product of the need to compete with IBM and its vast field maintenance organization, was installing a phone line in each Symmetrix and using sophisticated algorithms to monitor the health of its various components. When the automated analysis indicated a component was going to fail, a “call home” was initiated, triggering the dispatch of an EMC field engineer with a replacement component or, in more severe cases, remote intervention by EMC engineers. This was an early example of how artificial intelligence-driven automation leads to human augmentation, allowing a small company to compete with the vast resources, human and otherwise, of a much larger competitor.
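A toy version of the “call home” logic, assuming a simple error-count threshold (real implementations rely on far richer telemetry and trend analysis; the names and threshold below are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class ComponentHealth:
    name: str
    soft_errors: int = 0       # recoverable errors seen so far
    error_threshold: int = 5   # assumed trip point for this sketch

def record_error(component: ComponentHealth, dial_home) -> None:
    """Count a recoverable error; if the count suggests an imminent
    failure, 'call home' so a replacement can be dispatched early."""
    component.soft_errors += 1
    if component.soft_errors >= component.error_threshold:
        dial_home(f"{component.name} predicted to fail "
                  f"({component.soft_errors} soft errors)")

# A drive accumulating retried reads eventually triggers the call.
drive = ComponentHealth(name="disk-07")
for _ in range(5):
    record_error(drive, dial_home=print)
```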
It pays, sometimes, to go against the conventional wisdom: EMC was the best-performing stock on the NYSE in the 1990s. That remarkable stock market run ended with the collapse of the dot-com bubble. But data continued on its non-linear growth path, with the dot-coms and their successors creating new ways, mostly for consumers, to generate, share, and use data.
The new data-driven companies such as Google, Facebook, and Amazon built their own IT infrastructures to handle the unprecedented data management requirements of their businesses, rather than rely on existing IT vendors. They wanted to overcome “the cost, operational complexity, and limited scale endemic to datacenter networks” at the time, as one Google paper puts it. To achieve these goals, they used off-the-shelf hardware components, creating a “converged infrastructure” of storage, servers, and networking, and developing innovative software to manage it all.
Having successfully met the requirements of managing very large and constantly growing amounts of data in different formats and from different sources (now called “big data”), Amazon and Google (joined by one old-timer, Microsoft) started to compete with traditional IT vendors by providing remote, “cloud”-based IT infrastructure services. Facebook launched another type of attack on traditional IT vendors by spearheading the Open Compute Project, sharing widely the nuts and bolts of managing the new type of IT infrastructure capable of meeting the new, Web-scale requirements.
The answer from the traditional IT vendors (and the startups competing with them) to these new market requirements and this new competition was to continue following the trajectory of Moore’s Law and pin their hopes on new technologies. In data storage, that meant replacing disk drives with all-electronic flash memory. “In 2010,” says Yanai, “there was a new notion. If you are dealing with digital data, there is no room for mechanical parts in your world, everything has to be electronic. Eventually [so the argument went] the cost of flash will be much better than the cost of spinning drives.”
Relying on flash, on moving data to and from the storage devices at electronic speed, became the conventional wisdom. “Analysts say that the days of the disk are over,” observes Yanai. He recounts how the CIO of a large company told him he would not let any mechanical device into his data center. “My bet was that his need was stronger than his perception,” Yanai sums up his most recent venture in defying conventional wisdom. He established Infinidat in 2011 to develop a product that would best answer what he sees as the most pressing challenge for his potential customers (including the aforementioned CIO, now an Infinidat customer).
What the market needs today, according to Yanai, is a cost-effective way to provide “real-time performance for data analytics.” The new wave of data unleashed by the Web, social media, and the proliferation of mobile devices, all the new sources and formats of data that have emerged over the last twenty years (mostly) in the consumer space, is now washing over the enterprise. The excitement over “big data” a few years ago represented the Googlization of traditional enterprises that wanted to collect lots of data or tap into external data stores. The current excitement over machine learning and “artificial intelligence” represents the desire of traditional enterprises to do something with all this data: analyze and mine it, find new insights, and find new ways to “monetize” the data and create new revenue streams.
This “digital transformation” of all enterprises, the need to analyze lots of data and, when needed, perform the analysis as the data is generated (“in real time”), is the market inflection point Infinidat is addressing with its innovative data storage device. Other companies are also responding to this market need. But yet again, when everybody zigs, Moshe Yanai zags. And again, the core philosophy is not to be a slave to the storage media, but to innovate in software and focus on the overall cost-effectiveness of the product.
“To scale, you need the cheapest available media,” says Yanai. “But cheap media comes with performance penalty and [reduced] reliability, so you need to find some trick to get out of it the performance, reliability, and availability required.” Infinidat engineers came up with multiple software tricks to compensate for the lesser attributes of the nearline SAS drives they use, drives typically reserved for “cold storage,” the archiving of infrequently accessed data. These tricks include using two layers of cache, converting the disk drives’ random access into faster sequential access, and automatically adjusting data management decisions in real time based on the customer’s data access patterns.
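The random-to-sequential trick is, in general terms, the log-structured write idea: absorb scattered writes in memory, then flush them to disk as one large sequential append while an index remembers where each logical block landed. A minimal Python sketch of that general technique (not Infinidat’s actual implementation):

```python
class LogStructuredWriter:
    """Toy illustration of converting random writes into sequential
    disk I/O: buffer writes in RAM, flush them as one append-only
    segment, and keep an index from logical block to log position."""

    def __init__(self, flush_threshold=4):
        self.buffer = {}             # logical block -> data, in RAM
        self.log = []                # simulated sequential on-disk log
        self.index = {}              # logical block -> log position
        self.flush_threshold = flush_threshold

    def write(self, block_id, data):
        self.buffer[block_id] = data     # random write lands in memory
        if len(self.buffer) >= self.flush_threshold:
            self.flush()

    def flush(self):
        # One large sequential append instead of many scattered seeks.
        for block_id, data in self.buffer.items():
            self.index[block_id] = len(self.log)
            self.log.append(data)
        self.buffer.clear()

    def read(self, block_id):
        if block_id in self.buffer:      # freshest copy may be in RAM
            return self.buffer[block_id]
        return self.log[self.index[block_id]]
```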
The end result is highly efficient (in terms of both data density and power consumption), highly reliable, affordably scalable, multi-petabyte data storage (a petabyte is one million gigabytes). It moves data so fast that Infinidat has issued a challenge to providers of all-flash data storage products, allowing enterprises to test the different approaches in their own data centers. Infinidat is confident they will find out they can have it all: cost, capacity, reliability, and performance.
There is a growing roster of enterprises that have been convinced that Infinidat’s data storage has all the necessary attributes at a competitive cost. Once they start using it, however, they discover how cost-effective it is over the life of the product. “Most of the problems in the computer room today are due to human errors,” says Yanai. “Simplicity takes care of that.” The benefits of Infinidat’s simplicity include ease of installation, the minimal training required to operate the product, reduced time to perform various data management tasks, and savings on floor space and energy costs.
The unconventional product is coupled with an unconventional approach to building a company and its business strategy. Startups typically start at the bottom of the market and climb slowly to the top, explains Yanai. They also start selling their product before it’s ready, expecting to learn, adjust, and even pivot, based on early customers’ feedback. Yanai did exactly the opposite.
“The idea was too crazy, I wanted to make sure we can really do it. We spent three years perfecting the product.” And the company went after the top of the market, addressing the customers with the most demanding requirements, because “we shine at the multi-petabytes level.”
Last but not least in Yanai’s list of ingredients for successful disruptive innovation is assembling the right team and providing it with an exciting and motivating vision. Three generations are represented among Infinidat’s 500-plus people: “The very young, the people with ten or twenty years of experience, and the grandpas,” says Yanai. When the company encounters an especially difficult problem, it organizes a SWAT team with all three generations represented, benefiting from multiple perspectives and experiences.
As for motivation, Yanai tells an apocryphal story about Winston Churchill. When German submarines became a serious problem in the Second World War, Churchill told his engineers, “I have an idea: Electrify the ocean.” They asked, “How are we going to do that?” and Churchill answered, “I gave you the idea, it’s your job to do it.” Says Yanai: “I tell my Infinidat colleagues ‘electrify the ocean’ and they find out how to do it.” If you don’t invest in people, he adds, “they will not electrify the ocean.”
Given Infinidat’s reported revenue growth and its assertion of being profitable since late 2016, electrifying the ocean and the magic of alchemy seem to be working. A sweet spot for Infinidat looks to be the smaller cloud providers around the world. They (and Infinidat as their data storage supplier) may represent a new generation of the cloud, significantly more cost-efficient than the home-grown IT infrastructures of Amazon, Google, Microsoft, and the other large cloud providers.
Another market segment with fast-growing multi-petabytes of data, where Infinidat’s value proposition may find a ready and willing audience, comprises the scores of enterprises that have been going through significant digital transformations in recent years and have found new ways to turn data into gold. Just like Infinidat.