DAVID B. YOFFIE

Mobileye: The Future of Driverless Cars

For the exclusive use of A. Walser, 2015.

In 1999, more than a decade before Google captured the imagination of the world with the idea of a self-driving car, Professor Amnon Shashua of Hebrew University in Jerusalem and entrepreneur Ziv Aviram had a dream about how to build the next-generation autonomous car. Rather than rely on unreliable radar or expensive lasers, they believed that a single, cheap camera combined with sophisticated software could reduce collisions, prevent accidents, and save lives. Fifteen years later, that dream was coming true: Mobileye went public on the NYSE in August 2014, with a valuation that quickly exceeded $11 billion. According to Aviram, everyone told them they were on the wrong track in the early days: they had picked the wrong technology, the wrong functions, and the wrong customers. But with 285 car models and 20 car manufacturers already committed to their technology, Mobileye was on a roll. Within a few years, Mobileye forecast that it would capture roughly an 80% share of the autonomous driving systems in the world. Success meant that Shashua and Aviram were newly minted billionaires—at least on paper. They were also close personal friends, who met through their wives. They enjoyed riding mountain bikes and motorcycles together, and even taking family vacations together. Shashua, as Chairman, and Aviram, as CEO, described their partnership as "two-in-a-box," an organizational innovation developed in the 1970s by Intel. Shashua handled all of the technical work in this deeply technical company, while Aviram managed the business side. On strategy, they made all decisions together. Sitting around the table in Aviram's 6th-floor office, in a nondescript building in Har Hotzvim (the high-tech park) on the outskirts of Jerusalem, the two partners debated two dilemmas about the company's future. First, car companies were notorious for squeezing their suppliers on price.
Aviram believed that part of Mobileye's success had come from maintaining stable pricing. Yet Mobileye's volumes were about to explode, and some car companies were threatening to look elsewhere if prices did not come down in the lower-end segments. Shashua and Aviram discussed whether Mobileye should sacrifice margin in order to retain share, or continue to hold firm on price. Second, they also debated their role in self-driving cars. This debate had immediate relevance because Shashua was about to give a talk at Google's campus in Mountain View. He also expected to meet with Google's team in charge of its self-driving car. Naturally, Google was going to have many questions about Mobileye's technology and business strategy. In fact, Mobileye had developed its own self-driving car. For Shashua and Aviram, this upcoming visit raised obvious questions about the efficacy of Google's approach, the roles that Google and Mobileye would each play in the future of self-driving cars, and whether Google was a potential competitor or partner for Mobileye.

Professor David B. Yoffie prepared this case with the assistance of Research Associate Eric Baldwin. It was reviewed and approved before publication by a company designate. Funding for the development of this case was provided by Harvard Business School and not by the company. HBS cases are developed solely as the basis for class discussion. Cases are not intended to serve as endorsements, sources of primary data, or illustrations of effective or ineffective management. Copyright © 2014, 2015 President and Fellows of Harvard College. To order copies or request permission to reproduce materials, call 1-800-545-7685, write Harvard Business School Publishing, Boston, MA 02163, or go to www.hbsp.harvard.edu. This publication may not be digitized, photocopied, or otherwise reproduced, posted, or transmitted, without the permission of Harvard Business School. 9-715-421 REV: FEBRUARY 20, 2015
The Vision of Assisted Driving and Self-Driving Cars

Amnon Shashua was described by his partner, Aviram, as a "brilliant computer scientist" who made Mobileye possible. Soft-spoken but confident, the Hebrew University professor received his Ph.D. from MIT's Artificial Intelligence Lab in 1993. An expert in visual systems and machine learning, Shashua had previously founded a company that developed camera-based machines to perform detailed inspections of auto parts.1 Shashua said that he "learned a lot from my first company," where he built a product for Toyota, which only Toyota wanted. "It was a miserable failure," noted Shashua, "the venture capitalists (VCs) took control of the company, fired the CEO, asked me to step down as Chairman, and ultimately sold the company for half of the invested capital." When he started Mobileye, he wanted to do things differently. The first difference was that he found a trusted partner in Ziv Aviram, an industrial engineer who had emigrated from Russia to Israel when he was 9 years old and a former army commander who had led 100 soldiers into battle. Described by Shashua as a "financial and managerial genius," Aviram believed that management was a profession that could be applied across industries: he had been CEO of Israel's largest bookstore chain, CEO of the country's biggest shoe retailer, and CEO of a water park before founding Mobileye. Shashua and Aviram's vision was to put Mobileye at the center of Advanced Driver Assistance Systems (ADAS). The world of assisted-driving and ultimately self-driving cars was replete with acronyms, ranging from LDW (Lane Departure Warning) and FCW (Forward Collision Warning) to TSR (Traffic Sign Recognition) (see Exhibit 1 for a list of acronyms).
These systems were designed to actively improve safety and avoid accidents, while airbags and seatbelts were passive safety measures designed to save lives after an accident. The next phase in ADAS was fully autonomous vehicles, which were still in the testing stage in 2014. However, many of the underlying technologies were already available. ADAS ranged from simple systems that warned the driver of an impending problem (e.g., that the car was beginning to drift out of its lane) to complex systems that actively took control of the vehicle (e.g., by steering the vehicle back into its lane or applying the brakes to avoid a collision or hitting a pedestrian). Mobileye used a single camera to scan the road ahead, identify obstacles, road signs, traffic lights, etc., then interpret the image and send signals to the driver or other car systems to take evasive action (see Exhibit 2 for pictures of how Mobileye works). Shashua and Aviram saw many of these capabilities coming before most of their customers and long before many competitors. They also believed that they could deliver many of these functions with a single, low-cost camera. Their theory, described by Shashua, was that: We understood early on that the camera should be the primary sensor. We began developing vehicle detection from a single camera back in the year 2000, when the industry believed radar would be primary. We began developing pedestrian detection back in 2002, when the industry was not even contemplating the necessity. Mobileye was the first to launch a Pedestrian Collision Warning feature in 2010. We were the first to launch FCW for detecting licensed vehicles back in 2011. In 2013, we were first to launch Autonomous Emergency Braking (AEB) on vehicles and pedestrians using only camera processing. In 2013, we were the first to launch Adaptive Cruise Control (ACC), which actively adjusted the speed of a vehicle to maintain a safe following distance during highway driving, from a camera.
To date, competitors have not introduced any of those functions on a single camera. (See Exhibit 3 for a list of Mobileye pioneering innovations.) Beyond the technology, Shashua said that many of the key elements of the business model were also clear "from Day 1." He knew, for example, that regulation would drive demand because the technology saved lives. He admitted that his biggest surprise was that it took 8 years to launch. "If I thought it could take until 2007," noted Shashua, "I probably would not have done it." Looking forward, Shashua and Aviram believed that the company's success depended on both evolutionary and revolutionary changes in cars and trucks. As regulations increasingly forced companies worldwide to add ADAS to reduce accidents and auto-related deaths, Aviram noted that, "We expect that by 2017-2019 the majority of new cars manufactured worldwide will have a camera equipped with active safety features." Perhaps more important, they believed that revolutionary trends were on the horizon, which would push the world "in leaps and bounds," according to Shashua, into the "realm of autonomous driving. We know today that the utopian vision of completely autonomous driving—where the driver can choose to be out of the driving loop for extended periods of time—is not about to be achieved in a single leap." In the 2016 time frame, Mobileye expected the first hands-free-capable driving at highway speeds. Drivers could not go to sleep or read a book, but Mobileye had already built a prototype car in 2014 which could drive on highways without driver intervention. Stop-and-go traffic would be next, followed by country roads, and ultimately, in 2018-2020, city traffic.
Shashua and Aviram did not share Google's optimistic projections that a self-driving car was only a few years away; they believed that the truly self-driving car, which allowed the driver to disengage totally, was probably a decade away.

Financing Mobileye's Growth

Mobileye had a relatively focused product line: it developed a custom semiconductor chip (called EyeQ), a bundle of software applications, and a simple camera and warning display that sold in the aftermarket. (See Exhibit 4 for pictures of their products.) Since it took 14 years for the company to make these products profitable, Aviram needed an unconventional approach to financing the company. From the very beginning, he didn't want VC money: he believed that VCs were "short-term investors," and Aviram knew that he needed patient capital. Instead, he found a broker and asked him to find 100 investors who would invest $5,000 each. In the end, he raised $1 million from 14 angels. His plan was to do small rounds, almost every year, with angels and friends. "My philosophy," said Aviram, "was to take more money than you think you will need. I wanted at least four years of capital on the balance sheet, but I always ran the company as lean as possible." As major car manufacturers began testing Mobileye's technology for detecting vehicles, road markings, and road geometry, Aviram raised $30 million in 2002, with a post-money value of $135 million.2 By 2006, Mobileye began installing systems in trucking fleets that would warn drivers of imminent collisions or unintentional lane departures.3 In 2007, BMW, GM, and Volvo became the first automobile manufacturers to include Mobileye technology in safety packages for production passenger vehicles. By 2007, Aviram concluded, it was time to take institutional money. Goldman Sachs led the first institutional round at a $500 million valuation, and took more than 20% of the company on its own books.
The timing was fortuitous: as its competitors retrenched during the Great Recession of 2008-2010, Mobileye "had money," according to Aviram, "which allowed us to be aggressive." Growing penetration of its technology spurred additional funding that increased the company's valuation to $740 million in 2010.4 By 2014, the company proudly announced that it had agreements to implement its technology in 237 car models from 20 automakers by 2016, plus another 40 or so unannounced models. With widespread adoption, the company experienced rapid revenue growth: it shipped 1 million of its EyeQ chips from 2007 to mid-2012, and delivered 1.3 million chips in 2013 alone (see Exhibit 5 on Chip Sales Growth). In the summer of 2013, Mobileye raised $400 million, which valued the company at $1.5 billion.5 When Mobileye did its roadshow for its IPO in the summer of 2014, it reported that revenues had doubled in each of the past two years, from under $20 million in 2011 to over $80 million in 2013. On the strength of that revenue growth, Mobileye reported a profit for the first time in 2013, with net income of nearly $20 million (see Exhibits 6a and 6b for the P&L and Balance Sheet).6 Investors and analysts responded positively when Mobileye went public on the New York Stock Exchange (NYSE) in the summer of 2014 (under the ticker symbol MBLY). The Jerusalem Post called it "the most successful share offering an Israeli firm ever managed abroad," despite the fact that it took place during a war between Israel and Gaza and the IPO "happened in the heat of the fighting."7 Indeed, strong demand led the company to raise its offering price from an initial estimate of $18 per share to $25 per share when it debuted on August 1; at that price, the company raised $890 million on a market valuation of $5.3 billion.
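The offering figures imply a rough share count, and a quick back-of-the-envelope check confirms they are internally consistent (illustrative arithmetic only; the share counts below are implied by the reported numbers, not taken from the prospectus):

```python
# Sanity-check the IPO arithmetic reported in the case:
# a $25 offer price, $890 million raised, and a $5.3 billion valuation.
offer_price = 25.0   # dollars per share, raised from an initial $18 estimate
proceeds = 890e6     # dollars raised in the offering
market_cap = 5.3e9   # market valuation at the offer price

shares_offered = proceeds / offer_price   # shares sold in the offering
total_shares = market_cap / offer_price   # total shares implied by the valuation

print(f"{shares_offered / 1e6:.1f}M shares offered")      # 35.6M shares offered
print(f"{total_shares / 1e6:.1f}M shares outstanding")    # 212.0M shares outstanding
```

So the offering sold roughly a sixth of the implied shares outstanding, consistent with a company raising growth capital while the founders and early investors retained most of their stakes.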
The price rose 48% on the first day of trading. When Mobileye's valuation passed $11 billion in early September, it became the third-largest Israeli company by market capitalization.8 Post-IPO, Mobileye employed over 400 people in Israel and the Netherlands (its legal headquarters), in addition to 150 people in quality assurance in Sri Lanka.9 Most of Mobileye's costs were in R&D; the company had no factories, and a tiny sales force, mostly focused on the aftermarket. Aviram believed that Mobileye had built a unique team. He argued that "in Israel, the loyalty to a firm is unparalleled. So far, none of our people have left for competitors. We save lives. It is so visible and our people love the company. I took 100 people to NYC to ring the bell at the NYSE on the day we went public. My biggest challenge is to keep them motivated and excited, even though they are rich. We made 300 people, mostly young engineers, into millionaires."

Mobileye Technology

One day in 1999, Shashua was giving a lecture on visual computing to a leading car manufacturer in Japan when he was asked whether two cameras were required to help a car "see." After a moment of reflection, Shashua said, "Why do you need two cameras? If a driver can drive with one eye, why can't a car do pattern recognition with one camera?" On his return to Israel, he founded Mobileye. The breakthrough insight was that Mobileye could design chips and software algorithms for image processing using a single, low-cost camera. With a small camera mounted on the windshield behind the mirror, Mobileye's custom-designed "EyeQ" System-on-Chip (SoC), combined with powerful software, would perform pattern recognition.
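How a single camera can stand in for a range sensor is worth a brief illustration. One classical approach (a simplified sketch under a flat-road assumption, with made-up camera parameters; not Mobileye's proprietary method) uses the known mounting height of the camera: the pixel row where an object's base meets the road determines its distance through the pinhole model.

```python
def range_from_mono(focal_px: float, cam_height_m: float,
                    y_base_px: float, y_horizon_px: float) -> float:
    """Estimate distance (meters) to an object from a single camera.

    Flat-road pinhole model: an object whose base appears dy pixels
    below the horizon line is at distance focal_px * cam_height_m / dy.
    """
    dy = y_base_px - y_horizon_px
    if dy <= 0:
        raise ValueError("object base must appear below the horizon")
    return focal_px * cam_height_m / dy

# Illustrative numbers only: a 1000 px focal length, a camera mounted
# 1.2 m above the road, and a vehicle whose rear bumper is imaged
# 60 px below the horizon line (row 400 vs. horizon at row 340).
print(range_from_mono(1000.0, 1.2, 400.0, 340.0))  # 20.0 (meters)
```

Geometric inference of this kind only partly substitutes for a true depth measurement, which is one reason the processing power behind the camera mattered so much in vision-only systems.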
Shashua noted that "only a camera sensor can capture the complexity and richness of the visual world and cost a few dollars." At the same time, Shashua explained that cameras have to overcome variability in lighting and weather conditions, which posed great challenges for extracting a stable and consistent interpretation under all driving conditions. A monocular camera posed the greatest challenge because it did not provide depth perception. As a result, cameras played a limited role in early ADAS. The primary sensor at the time was radar, which was largely used for adaptive cruise control. Mobileye never manufactured the camera. Instead, Shashua and Aviram made two big strategic decisions very early in the company's history. Aviram recounted that: The first decision was to develop all the apps in one unit. Amnon and I were in the middle of a motorcycle ride, standing on a cliff overlooking the Dead Sea. While standing there, we received a call from one of our customers, a Tier 1 auto supplier. [The difference between Tier 1 and Tier 2 auto suppliers is defined below.] The customer asked us to bid with them on a lane departure warning system. At the time, we had only developed the software for a collision warning system, which used a single camera to recognize a car was in front of you. Nonetheless, we decided to develop the LDW application and won the bid. During this process, we realized that a full suite of applications is the key for the automotive industry. In order to achieve this, we needed to bundle lane detection, vehicle detection, pedestrian detection, and other features into a full suite of applications in one unit—and not go the traditional path of one app at a time. Our second decision was related to the development of our own System-on-Chip.
When we founded the company, our initial strategy was to develop software and use pre-existing hardware. We agreed with a microprocessor manufacturer that they would adapt one of their chips to automotive grade so that it could be used for our purposes. Then our supplier missed its first two deadlines to deliver the chip. At that point, we realized that we had to develop our own SoC, even though we had zero expertise in chip design. The decision to develop their own chip meant $3-$5 million in added investment and more than a three-year delay. Many in the industry were certain this was a mistake. According to Aviram, "one of the car companies called us to Germany and told us to drop our plans for the SoC and stick to software. But we were convinced that integrating hardware and software would allow us to highly optimize power and performance for a vision system." Mobileye found STMicroelectronics, a large semiconductor firm based in Geneva, to manufacture the chip at automotive grade and guarantee reliability. The chip cycles were roughly 3-4 years, and with each new generation of product, Mobileye's EyeQ SoC was six to eight times more powerful than its predecessor. The added performance empowered increasingly sophisticated visual processing, which partly compensated for the lack of depth perception and enabled more complex driver assistance and autonomous driving functionality. One crucial differentiator for Mobileye's technology was the ability to support a wide variety of driver assistance functions using a single camera. Bosch, Continental, and Autoliv promoted so-called "stereo cameras," which used two cameras for triangulation and greater depth. Their theory was that two cameras were better than one.
Shashua argued that there was a small niche for stereo, but it was "shrinking." Most OEMs (excluding Mercedes), he noted, were "phasing out stereo" because of the added cost (an extra 40%), increased complexity of the software, difficult calibration, poor aesthetics, and poor performance. Mobileye's monocular camera dramatically lowered the cost for an auto manufacturer compared to multi-camera systems, as well as competing technologies (such as radar and lidar, which are discussed below). Shashua and Aviram insisted that only cameras had the potential to cover the entire range of functionalities for driver assistance systems. Just as human vision was the most powerful sense for navigating and reacting to stimuli, vision-based systems were ultimately the most trustworthy for the automobile. From the very beginning of the company, Mobileye's team had worked to position the camera as the primary sensing technology in driver assistance systems.10 In addition to integrated cameras, ADAS could be based on a variety of other sensing technologies that enabled vehicles to "perceive" the world around them (see Exhibit 7 for Mobileye's assessment of competing technologies). At the most rudimentary level, software developers were giving away simple lane departure warning apps that used the camera on an iPhone or Android phone. These apps could not meet regulatory requirements, and the limited computational power of a smartphone could not guarantee performance. The most widely used alternative technologies were radar and/or lidar. Lidar, short for Light Detection and Ranging, worked like radar except that it sent out pulses of light rather than radio waves.
An advanced lidar system like the one at the center of Google's self-driving car took in more than a million measurements per second and used that data to construct a high-resolution, 360-degree image of the car's surroundings.11 The challenge was that lidar was very expensive. The units used in Google's cars, which included 64 lasers in a turret that rotated at 600 rpm, were developed by Velodyne and cost over $75,000 each.12 The cost of Google's lidar was obviously prohibitive, so automakers and their suppliers were attempting to accomplish as much as possible with less expensive sensing technologies. According to one analyst, Velodyne was working on a new ~$8,000 lidar product, which was still much too expensive to be competitive in cars.13 And Valeo, a large French auto supplier, was developing a highly simplified lidar system for $350, but according to Shashua, "there were few, if any, technical advantages." In 2014, most ADAS systems in production vehicles employed radar, cameras, or a combination of both. In fact, many of the functions of driver assistance systems could be accomplished by either camera- or radar-based systems, and one of the issues faced by automakers and their suppliers was to determine which technologies were best suited to which applications. Radar had a longer range, was very good at measuring distances, and was better suited to adverse weather conditions, whereas vision-based systems possessed a wider angle of vision, were better at differentiating between objects (radar had difficulty detecting stationary or slow-moving objects, including pedestrians), could interpret street signs, and could detect the road surface and roadway markings. Shashua noted that "radars have improved in resolution, performance and cost over the years. Radar sold to OEMs for an average price of $100-$200, while a monocular camera system was $90-$150.
As a result, a short- or medium-range radar system performing adaptive cruise control is price-competitive with a monocular camera. Resolution and sensitivity of radars have also improved to the level of detecting pedestrians under certain scenarios." However, because radar could not respond accurately to stationary objects (a consequence of its reliance on the Doppler effect), radar-only systems generated far too many false positives. Sudden braking caused by false positives in radar-only systems had led to massive recalls in the past (e.g., the Infiniti JX35, Lexus Crown, and Mitsubishi Outlander). Beyond the technological issues, camera-based systems were also significantly less expensive for complex applications than their radar-based counterparts. It was often the case that car manufacturers would rely on multiple sensing technologies, in so-called "fusion" systems. In the long run, fusing vision with radar would be important to add redundancy and improve performance under some road conditions.14 For example, Mobileye partnered with Delphi—a large Tier 1 supplier—to implement LDW, collision warning, and automatic braking systems in several 2007 Volvos. Delphi supplied the radar systems to Volvo, as well as integrating Mobileye's vision processing engine in a camera module. As Shashua noted, "for high end cars, where consumers are less price sensitive, fusing radar and cameras to exploit radar's advantages in low visibility scenes, makes sense." To stay ahead of the revolutionary changes thrusting the industry towards autonomous driving, Mobileye planned to push its technology dramatically forward. According to Shashua: To launch hands-free driving capabilities starting from a 2016 time frame requires us to push the limits of visual processing. We have been developing new capabilities such as debris detection (a 10cm object from 50m away), traffic-light detection, free-space estimation (driving between two objects), barriers, cones, and more.
We have been designing new multiple-camera configurations to support the higher safety standards required for hands-free driving. To prepare for autonomous driving, Mobileye was already testing adding two additional forward-facing cameras—one with a 150-degree field of view to detect close objects and one with a 30-degree field for an extended detection range for small objects (see Exhibit 8).15 Ultimately, side- and rear-view cameras would also be necessary for automated parking and rear accident detection.

Mobileye's Role in the Automotive Value Chain

The global automobile industry was usually divided into three levels. At the highest level were OEMs, original equipment manufacturers. OEMs were the names most widely associated with cars and trucks, such as General Motors, Toyota, and Daimler-Benz. Next came Tier 1 suppliers, huge component and systems companies that sold to OEMs, such as Bosch, Delphi, Continental, and TRW. Mobileye was a Tier 2 supplier: most of its revenue came from selling chips and software to a Tier 1 company, which would then integrate Mobileye's technology into a solution for a car or truck (see Exhibit 9). For most of Mobileye's history, Aviram pursued an unconventional sales strategy: he focused exclusively on lobbying OEMs to specify Mobileye technology rather than working with his direct Tier 1 customers. According to Aviram, "we went only to OEMs for the first six years; we did not even meet with any big Tier 1s." In the 2000s, automakers began to implement driver assistance systems, which were marketed as active safety features in their vehicles. By the early 2010s, virtually all major vehicle manufacturers were offering ADAS technologies as either optional or standard features in at least some of their model lines.
A number of automotive manufacturers had implemented Mobileye technologies by 2013, including GM, Ford, Chrysler, Volvo, BMW, Daimler, Jaguar, Honda, Nissan, and Hyundai. General Motors was the first American automaker to implement Mobileye technology, when it introduced a Lane Departure Warning system for three of its high-end 2008 models. Tesla was also looking towards Mobileye as its likely solution to provide ADAS. Elon Musk announced in September 2014 that his fully autonomous car should be ready in 2020, and "they will be a factor of 10 safer than a person [at the wheel] in a six year time frame."16 BMW became the first European automaker to introduce a forward-looking, Mobileye camera-based advanced driver assistance system in its 2007 5-series models. Subsequent BMW models added traffic sign recognition, intelligent headlight control, forward collision warning, and adaptive cruise control. Some models were vision-radar fusions. In 2010, Volvo introduced a pedestrian detection system developed by Mobileye and Delphi, which used vision-radar fusion and was capable of avoiding a collision with pedestrians. It remained the only system of its kind until 2013, when other carmakers began introducing similar capabilities. In 2011, BMW announced a package of driver assistance systems, developed by Mobileye and the Swedish firm Autoliv, including lane departure warning, high-beam assist, speed limit indication, non-passing indication, and forward collision warning, all controlled by a single camera powered by Mobileye's EyeQ chip and algorithms. This launch marked the first vision-only forward collision warning system that did not rely on radar. Mobileye's technology made slower progress among Japanese automakers than in the United States or Europe. Japanese automakers began adopting Mobileye technology only after 2012.
For example, Mobileye partnered with Magna Electronics (a Tier 1 supplier based in Toronto) to supply Honda with ADAS solutions (including Forward Collision Warning, Lane Departure Warning, headlight control, and headway maintenance) for the 2013 Accord and 2014 Civic. Going forward, Aviram expected an explosion in ADAS adoption. In 2014, only 2% of cars had some form of ADAS. While Mobileye's business plans called for adoption to rise to 50% by 2020, Aviram believed it was more likely that "90% of all cars by 2020 will have some form of ADAS technology. We currently have about 80% of the vision market today, and as the market transitions to vision as the primary sensor, we should grab an 80% share. By 2020, we expect roughly 110 million cars to be sold, up from 82 million in 2014." The only two OEMs where Aviram expected a difficult road were Mercedes and Toyota, both of which were developing their own in-house solutions. Mobileye partnered with a number of Tier 1 suppliers, including Delphi and TRW, which were among the largest automotive suppliers in the world. They also worked with smaller firms, such as Magna Electronics, Kostal GmbH (from Germany), and Mando Corporation (based in Korea). Mobileye typically supplied the core image processing technology while its partner supplied the image sensor and camera module. Other major Tier 1 suppliers, such as Bosch, a giant automotive parts manufacturer, were developing autonomous driving systems using a variety of sensing and processing technologies, akin to the Google effort. Of course, one of Mobileye's challenges was that many Tier 1 suppliers, including TRW, Delphi, Continental, and Autoliv, were competitors as well as customers.
They all developed ADAS packages; some were co-developed with Mobileye's technology, and others were based on radar, lidar, or vision-processing systems developed internally. For example, Forward Collision Warning and collision mitigation by braking systems could be radar-only, radar-vision fusion, or vision-only. Mobileye's vision-only systems competed with radar-based systems implemented by Delphi and TRW, who were Mobileye's partners in developing radar-vision fusion systems performing the same functions. As Mobileye gained greater traction with OEMs, it told its Tier 1 customers that it would no longer work with them if they continued to develop competing vision systems. Aviram commented that "TRW and Mando stopped their internal development programs in order to continue working with us. But Autoliv, which focused heavily on stereo vision systems and supplied Mercedes, refused. Continental, which had their own vision, lidar, and radar programs also refused; and Denso (Toyota's primary Tier 1 supplier) had no interest in partnering with us." While Mobileye was clearly the leader in camera-based ADAS systems, it was not completely alone in the space. For example, Aptina Imaging Corporation, based in California, had been the imaging business of Micron Technology from 2001 to 2008, when it was spun off and then sold to private investors in 2009. Another potential competitor was OmniVision, also based in the U.S., which was founded in 1995. Unlike Mobileye, which was focused exclusively on the automotive market, both Aptina and OmniVision developed digital imaging solutions for a variety of markets, including entertainment, medical imaging, PC cameras, mobile phones, and security and surveillance, in addition to automotive.
Since some of the sensing functions necessary for driver assistance systems could be carried out by other technologies, especially radar and lidar, Tier 2 suppliers of those technologies could also be considered Mobileye competitors. These included Ibeo, a German company that developed laser-scanner technologies for automated driving applications, and Velodyne, located in California, which manufactured lidar sensors for use in obstacle detection and navigation for self-driving vehicles, as well as for surveying and 3-D mapping. Despite competition, Shashua argued that Mobileye had a special position in the value chain: Our position as a Tier 2 in the automotive supply chain allows us to work with the entire industry. As a Tier 1, our capacity would have been reduced somewhat as OEMs tend to work with a selected subset of Tier 1s. This constraint does not apply for a Tier 2. We have OEMs (such as GM, Hyundai and Chrysler) that receive their EyeQ supply through multiple sources. Aviram commented that “OEMs wanted us to be Tier 1, but we refused. Tier 1s have high revenues, thin margins, and huge support costs. OEMs generally insist on multiple sources for Tier 1 bids, and expect many Tier 1s to operate on a cost-plus basis. We strategically chose to focus on a small piece of one market with very high margins.” Competing suppliers would face a number of challenges in trying to beat Mobileye. While Mobileye heavily patented its innovations, Aviram stated firmly, “we don’t rely only on patents to protect us.” Aviram noted that “the time to penetrate an OEM account is very long. It could take up to 7 years to launch a new product, and once you are designed in, the product life cycles tend to be long as well—up to 8-10 years.” (See Exhibit 10 for details on the OEM adoption process.)
For Mobileye, this meant that it could forecast its core revenue for the next 5-7 years with high certainty. Moreover, OEMs did not want multiple systems in a car: a second ADAS system would add complexity and cost and reduce reliability. In 2014, Mobileye could offer virtually everything an OEM wanted with a low-cost, single-camera system. Mobileye’s bundle of applications could include Lane Detection, Lane Departure Warning, Lane Keeping and Support, Vehicle Detection, Forward Collision Warning, Adaptive Cruise Control, Traffic Jam Assistant, Emergency Braking, Pedestrian Detection, Collision Warning, Traffic Sign Recognition, and Intelligent High Beam Control. The total cost to the OEM of this bundle ranged from $100 to $150, including the camera and related components from the Tier 1 supplier. Furthermore, if a Tier 1 or Tier 2 player was not already partnering with Mobileye, getting a new system adopted necessitated extensive testing. Mobileye’s applications had to work correctly 99.99% of the time. People’s lives were at stake, and car companies demanded extensive data to validate accuracy. Mobileye’s technology had been tested by OEMs for more than a decade. Each OEM would put the cameras into a test car and drive hundreds of thousands of miles in different locations and conditions to see where the system worked and where it failed. Aviram gave the example of an OEM driving the car on a winding road, lined with trees casting big shadows. “The curves and the shadows threw off the system,” remarked Aviram, “but once we had the data, we could fix it.” By 2014, Shashua noted that: The innovation invested in object detection goes beyond software and algorithmic competence (that others can possibly acquire over time).
Achieving the quality of object detection that we have requires amassing huge amounts of data (millions over millions of miles driven) covering many geographic areas (worldwide), times of day (daytime, dusk, nighttime), and scene variations (highway, country, city roads). Only through large datasets can one properly validate the functionality of safety functions—and due to their role as a safety function, an in-depth validation is crucial in order to avoid false-positive actuations. This kind of data cannot be produced from a single production program or from working with a single OEM. The quality of the data depends on the number of miles driven and on the data being unbiased. Biases occur when data are collected from a single OEM program. Frequently, an OEM will do much more testing in its home country. For example, you can find competing systems that track lanes remarkably well in Germany, but that same system would suffer noticeable degradation on U.S. roads. We are working with 20 OEMs covering most geographies in the world. One Wall Street analyst summed up his impression of the competitive landscape in late 2014 by saying, “Mobileye continues to maintain clear superior capabilities at the moment… [but] competition will intensify over time.” He argued that no one had found a way to challenge Mobileye yet, and “it may be too late for brand new competitors to start from scratch.” Even if competitors caught up technically, the “validation period remains quite long, and OEMs may be reluctant to award high-volume global platforms to less experienced competitors since the percentage of false
pos/neg rates matters more on high-volume programs where higher incident counts can draw regulatory scrutiny and recalls.”17

The Aftermarket

There was another large market, outside of traditional OEMs, which Mobileye strategically decided to serve early in its history: the standalone aftermarket. Mobileye viewed the installed base of existing vehicles—something close to 1 billion vehicles on the road—as a big opportunity. Starting in 2007, Mobileye sold cameras (which included Mobileye’s processing technology) and an audio/visual display directly to consumers and vehicle fleet managers for $350-$400 at retail. At those retail prices, margins were well over 50%. These products warned the driver of possible collisions, pedestrians in the vehicle’s path, and lane departures, and provided intelligent high beam control and speed limit indicators. For truck fleets, in particular, the appeal was obvious: reducing accidents would lower costs, including repairs and insurance, as well as improve utilization. The aftermarket accounted for 22% of Mobileye’s revenue in 2013, down from 35% in 2011.18 As the OEM business ramped up over the next several years, aftermarket revenues would grow handsomely but continue to fall as a percentage of the overall business. According to Aviram: Most new suppliers in the auto industry will start with the aftermarket, and once they’ve proven themselves, they go to OEMs. Mobileye did the opposite. Yet selling our systems to cars and trucks that are already on the road has given us a lot of knowhow. We have learned how to design better cameras and camera housings. Once we figure it out, we can provide reference designs to our Tier 1 customers. The aftermarket is also a great marketing tool to convince OEMs that our solutions really work. When Volvo saw our Pedestrian Detection system working in the aftermarket, they wanted it for their cars.
The aftermarket also gives us real proof of concept for regulators. Once we can demonstrate improved safety on the road, regulators are more comfortable mandating the technology. In Israel, for example, the regulators saw the data on improved safety with our system and decided to refund import duties on cars coming into the country if Mobileye is installed.

Driving Adoption: New Applications and Government Regulation

The early days of ADAS involved simple applications, such as Lane Departure Warnings. By 2014, the number and breadth of applications that could help a driver avoid an accident was expanding dramatically. Mobileye had pioneered many of these applications (see Exhibit 2). Going forward, the enhanced processing power of the EyeQ chips and better software would bring the world closer and closer to the self-driving car. In fact, most of the enhanced capabilities were driven by software. This meant, according to Aviram, that “it should be possible in the near future to put the core hardware capabilities into every car, and then update the car—much like we update computers and phones.” Consumers were also becoming increasingly open to the idea of autonomous and driverless cars. A 2013 survey found that 57% of consumers worldwide would trust a driverless car, although only 46% would allow their children to ride in them.19 Beyond trust and comfort, price was a major barrier to adoption. Mobileye’s monocular cameras were the cheapest solution, which helped drive demand. A 2012 survey found that 37% of vehicle owners would be willing to purchase a fully automated car, but that figure dropped to 20% when respondents learned that the technology would cost an additional $3,000.20
Consumers had shown more willingness to adopt various driver assistance applications, marketed primarily as safety features, which could be added to vehicles at much lower cost. Safety had always been a major consideration in consumers’ automotive purchases, and safety was the most important factor influencing consumer adoption of autonomous driving and driver assistance technologies. Many driver assistance features were designed to address the most common sources of traffic fatalities. For instance, the National Highway Traffic Safety Administration (NHTSA) found in 2011 that 53% of traffic fatalities were caused by roadway departure, and it was estimated that Lane Departure Warning and Lane Keeping Assistance technologies could prevent 7,500 traffic fatalities per year in the United States.21 Similarly, a 2012 NHTSA study found that 18% of all motor vehicle crashes and 10% of traffic fatalities in the U.S. were caused by distracted drivers, an issue that many ADAS features were designed to address.22 In addition to consumer demand for driver assistance systems, regulatory bodies and governmental agencies around the globe had begun to recommend and, in many cases, mandate the implementation of driver assistance systems as active safety features for new vehicles. Aviram believed that “western world governments would require ADAS in all cars within 3-4 years. Since our system is a 3rd eye, which dramatically reduces the probability of an accident, regulators see the logic.” Both Aviram and Shashua emphasized that Mobileye wanted to “solve accidents” and “save lives,” which was the focus of regulators. They were less interested in features that offered greater convenience, such as parking assistance.
Shashua argued that growing regulation in favor of safety could be critical to Mobileye’s future: There is an evolutionary trend toward an increased role of regulation providing incentives (through star ratings) for car models to come with active safety technology, i.e., technology that helps drivers avoid accidents and/or supports automatic braking to avoid and mitigate an imminent collision. Starting in 2011, the U.S. National Highway Traffic Safety Administration (NHTSA) has identified technologies that help drivers avoid a collision—specifically Lane Departure Warning (LDW) and Forward Collision Warning (FCW)—through its 5-star rating program. Since then, the availability of these features has increased dramatically. Starting in 2014, Euro NCAP has identified Auto Emergency Braking (AEB) as a standard requirement for all car models with a 4- and 5-star rating. NHTSA is planning to come forward with similar requirements, the Japanese Ministry of Land, Infrastructure, Transport and Tourism (MLIT) has announced AEB mandates for 2016, and the Australian ANCAP has adopted similar measures since 2012. This trend is likely to expand further and engulf the automotive industry worldwide. At Mobileye, we have witnessed a sharp increase in demand for our technology over the last few years that is explained partly by increased driver acceptance and awareness of driver safety technologies and partly by the influence of regulators on OEM plans.

The Coming Revolution: Autonomous Driving

The excitement around Mobileye was driven partly by the anticipation of a self-driving car. Prototypes of self-driving cars had been around for a long time, but none captured the world’s attention like Google’s. By 2014, Google’s test vehicles had logged over 500,000 miles, and co-founder Sergey Brin predicted that self-driving cars would become commercially available by 2017.
Google also introduced a prototype of a self-driving vehicle in 2014 that it had built from scratch, which had neither a steering wheel nor gas and brake pedals. Google’s effort had attracted the most attention, but there were a number of other efforts underway to develop self-driving vehicles. The Defense Advanced Research Projects Agency (DARPA) sponsored a series of challenges in the mid-2000s that gave a boost to the development of self-driving cars. In late 2013, a team of researchers from Carnegie Mellon, who had won the DARPA prize in 2007, announced a successful road test of a self-driving car in highway, suburban, and urban settings. As the list of successful test prototypes grew, industry observers predicted that highly autonomous driving would be widely available by 2020, with a truly self-driving vehicle coming some time later.23 Proponents of autonomous driving pointed to a variety of benefits, safety above all. Autonomous driving advocates were working toward, in the words of a Bosch executive, “a vision of collision-free driving.”24 As of 2012, nearly 1.3 million people were killed every year in auto accidents and, according to the World Health Organization, that number would climb to over 1.8 million by 2020. Driver error (including distraction, impairment, or drowsiness) caused over 90% of accidents. Eliminating the human factor through automated driving had the potential to reduce traffic fatalities dramatically.25 Aviram noted that “92% of accidents are due to driver failure. If you have a fly inside your car, and you try to kill it, you are 9 times more likely to have an accident. Mobileye can both warn the driver and keep the car on track.” There were other potential benefits in addition to safety.
Autonomous cars could reduce congestion and improve highway capacity by traveling more closely together and coordinating movements through intersections. A collision-free world would mean that automakers could eliminate steel bumpers and roll cages, making vehicles much lighter and more fuel-efficient. Fuel efficiency could also be improved by the practice of “platooning,” in which cars traveled in tight formation, allowing them to draft off the cars in front of them and reduce drag. A final potential benefit was the convenience and increased productivity that would arise as commuters could safely engage in other tasks and activities while their vehicles handled the driving. If safety regulations were helping to drive the adoption of autonomous driving technologies, legal ambiguities surrounding self-driving cars represented a major hindrance to their adoption. Questions surrounded the legal definition of “driver.” For example, was the person in the driver’s seat of a self-driving car the driver? Governments and insurance companies would have to establish guidelines to determine liability in the case of an accident and responsibility for legal infractions, such as speeding. In addition, legislative and regulatory bodies would need to define standards for licensing drivers of automated vehicles and establish standards for testing self-driving cars.26

Deciding Mobileye’s Role in the Coming Revolution

From Shashua and Aviram’s perspective, the future seemed incredibly bright. The self-driving revolution would create never-ending demand for new applications, which would grow the business exponentially over the next decade (see Exhibit 11 for Mobileye’s Vision of Steps for the Self-Driving Car). Yet success brought a growing number of questions to the Chairman’s and CEO’s agenda, such as how they could best take advantage of their healthy market capitalization. Should they consider branding Mobileye’s technology?
How aggressively should they pursue cutting-edge technologies, such as vehicle-to-vehicle communication, which could dramatically improve autonomous driving functions? How should they manage pricing going forward, as their systems became ubiquitous? And how should they play (or not play) with Google and its efforts to push forward the self-driving car? These last two questions had immediate relevance. Pricing had always been easy for Mobileye. Aviram firmly believed in what he called “disruptive strategic pricing.” On one hand, Mobileye had relatively little direct competition. With its technological lead, Aviram believed that he could easily charge $100, on average, for the chip and software. Since analysts estimated that the chips cost roughly $10-$13 to produce, and software had zero variable cost, the margins would be extraordinarily high, especially for the automobile industry. On the other hand, a $100 price would translate into roughly $400-$500 in added cost to the consumer, which would limit end-user demand and OEM adoption. Therefore, Aviram and Shashua decided very early in the game to charge around $45-$55, which bundled a suite of applications (usually Lane Departure Warning, Autonomous Emergency Braking, and a few other features) and added about $300 to the price of a car. For the most part, Mobileye technology went into premium cars, where a $300 consumer premium was reasonable. Aviram argued that aggressive pricing combined with never discounting had two powerful advantages: Mobileye could still make great margins, and competitors would lose money if they matched the price. Over time, management hoped to raise average selling prices as the company added more and more sophisticated applications.
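The margin logic behind this pricing can be sketched with the case’s own numbers ($45-$55 bundle price, $10-$13 estimated chip cost, zero variable cost for software). Pairing the range endpoints this way is an illustrative assumption, not Mobileye’s disclosed accounting:

```python
# Rough gross-margin arithmetic for "disruptive strategic pricing".
# Price and cost figures are from the case; the endpoint pairing is illustrative.
def gross_margin(price, variable_cost):
    """Gross margin as a fraction of the selling price."""
    return (price - variable_cost) / price

# Software carries zero variable cost, so the chip is the only per-unit COGS here.
worst_case = gross_margin(45, 13)   # lowest price paired with highest chip cost
best_case = gross_margin(55, 10)    # highest price paired with lowest chip cost
print(f"Gross margin range: {worst_case:.0%} to {best_case:.0%}")
```

Even at the deliberately low $45-$55 price point, the implied gross margin of roughly 71% to 82% is far above typical Tier 1 automotive margins, which is the crux of Aviram’s argument that competitors would lose money if they matched the price.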
By the end of 2014, Mobileye had different bundles that ranged in price from $40 to $90 and up to $180, depending on the number of advanced autonomous driving features. This strategy was relatively easy to sell for higher-priced cars. Yet as demand for Mobileye solutions started to explode, some OEMs wanted to deploy simple ADAS systems in low-end cars, and they did not want to pay an average $45 price. One of the big American OEMs argued that a competitor’s camera might not be as good as Mobileye’s, but it was “good enough”; Mobileye, it argued, should charge like Tier 1 suppliers, on a cost-plus basis (implying ~$20). Competitors were also trying to position Mobileye as the ‘expensive’ option. But Aviram strongly preferred selling the complete bundle at the full price. And Shashua noted, “good enough…is of course foolish because Autonomous Emergency Braking is a safety feature (the car actuates brakes) and there is no such thing as good enough – it should be flawless.” Despite their confidence in their product, Shashua and Aviram sat around the table debating the questions: Should we give up some market share at the lower end of the market to hold prices? Should we let competitors get a foothold at the low end? Should we offer a discount for the low-end bundles, which would lower average selling prices for the company as a whole (and might affect valuation)? If we start to discount, can we prevent broader price erosion? The longer-term challenge posed by Google was also coming to the forefront. As Shashua prepared for his presentation at Google’s offices in Mountain View, he and Aviram discussed Google’s role in the self-driving car, the similarities to and differences from Mobileye’s approach, and whether there was space for cooperation. Neither Shashua nor Aviram believed that Google’s prototype was a model for future cars. Even the most forward-looking car companies, such as Tesla, were working with Mobileye rather than Google.
The puzzle was trying to figure out Google’s real intentions. OEMs assumed that the technology for a self-driving car could not cost more than $1,000, which would lead to a $3,000-$5,000 retail premium. Since Google’s total cost was closer to $170,000, Shashua and Aviram were skeptical that it could ever be commercialized. Moreover, there were two paradigms for self-driving cars, according to Shashua: “‘store and align,’ which was Google’s strategy; and ‘sense and understand,’ which was Mobileye’s strategy.” Google constructed high-definition maps and used sensors to align with the maps. Google would do 3D recordings, driving multiple times along the same routes, and then use the car’s sensors to align with the pre-stored description. Taking this approach globally was not practical, even if it was manageable in Mountain View, California. Mobileye’s approach was fundamentally different: Shashua explained that “we use sensors to get data, interpret the data with our software, and use machine learning techniques to allow the car to adapt to the scene.” While Google was arguing for a huge leap into the future, Mobileye was suggesting a number of smaller leaps, which would likely take a decade. Most forward-looking OEMs were aligned with Mobileye’s future roadmap: In October 2014, Elon Musk announced that Tesla would lead in autonomous driving, installing Mobileye’s camera and its most advanced functions in all new cars. The forthcoming Tesla Model S would also come with a front radar (fusion) and sonar (sound waves) for parking sensors to allow very close proximity detection. Many other OEMs were expected to follow by 2018. Under these conditions, Shashua and Aviram discussed how they might connect with Google. Should Shashua ‘open the kimono’ to Google on his forthcoming trip?
Should Aviram try to make Google a Mobileye customer, and allow Google to develop a deeper understanding of how its technology worked? Should they explore cooperation, as they did with OEMs, or keep Google at bay?