The Trillion Dollar Question

Roger Peng
2018-08-09

Recently, Apple’s stock price rose to the point where the company’s market valuation exceeded $1 trillion, making it the first U.S. company to reach that benchmark. Numerous articles followed, describing Apple’s journey to this point and how it got there. Most people describe Apple as a technology company: it makes technology products (iPhones, iPads, Macs), all of them computing devices. But there is another way to think about what kind of company Apple is and how it became so successful.

Neil Cybart, an analyst over at Above Avalon, likes to describe Apple as a design company focused on building useful tools for people. Of the latest round of profiles occasioned by Apple reaching a $1 trillion market valuation, he writes:

Despite supposedly being about chronicling how Apple went from near financial collapse in the late 1990s to a trillion-dollar market cap, a number of articles did not include any mention of Jony Ive [Apple’s Chief Design Officer], or even design for that matter. To not include Jony Ive in an article about Apple’s last 20 years is unfathomable, demonstrating a clear misunderstanding of Apple’s culture and the actual reasons that contributed to Apple’s success. Simply put, such profiles failed in their pursuit of describing Apple’s journey to a trillion dollars. Apple is where it is today because of design – placing an emphasis on how Apple products are used. Every other item or variable is secondary. [emphasis added]

For as long as I have followed computers, people have complained that Apple’s hardware is substandard. Other companies, like Dell, Gateway, Acer, and Lenovo, had long been making computers that were “better” than Apple’s. Apple’s value has always come from coupling good hardware with premium software. But for a long time the market did not appreciate that, and Apple nearly went bankrupt as a result.

The “Speeds and Feeds” Era for Data Analysis

When I was growing up, computers were all about so-called “speeds and feeds”. The only things people talked about were the megahertz of their processor or how many megabytes of RAM a computer had. A computer with a higher megahertz CPU was by definition better than a computer with a lower megahertz CPU. More RAM was better than less RAM and more disk space was better than less disk space. It was easy to compare different computers because we had quantitative metrics to go by. The hardware itself was a commodity and discussion about software was nonexistent because every computer ran the same software: Windows.

We are very much in the “speeds and feeds” era of data analysis right now. There is tremendous focus on, and fascination with, the tools and machinery underlying data analysis. Deep learning is just one example, along with an array of related machine learning tools. Websites like Kaggle promote a culture of “performance,” where the person who can cobble together the most accurate algorithm wins. It’s easy to compare algorithms to each other because there is often a single performance metric that everyone can agree to compare on.
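To make that concrete, here is a minimal sketch, in Python with scikit-learn, of what leaderboard-style comparison looks like: several candidate models, one dataset, one agreed-upon metric. The particular dataset and models are illustrative assumptions on my part, not anything Kaggle prescribes.

```python
# A minimal sketch of leaderboard-style model comparison: fit several
# models on the same data and rank them by a single accuracy number.
# The synthetic dataset and the two candidate models are illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "random forest": RandomForestClassifier(random_state=0),
}

# One number per model makes ranking trivial -- which is exactly the point.
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: {accuracy_score(y_test, model.predict(X_test)):.3f}")
```

The single number is what makes the comparison so easy, and so seductive: it says nothing about whether the analysis serves any purpose beyond the metric itself.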

Serious investment is being made in improving algorithms to make them more accurate, efficient, and powerful. We need these algorithms to be better so that we can have self-driving cars, intelligent assistants, fraud detection, and music discovery. Even the hardware itself is being optimized for these specific algorithms. This is the “more gigahertz, more RAM, more disk space” of our time. As easy hardware wins fade into the past (witness Intel’s struggles), the focus shifts to improving the performance of the machine learning software running on top of the hardware.

All of this is necessary if we want to reap the benefits of machine learning algorithms in our daily lives. But if the computing industry has anything to teach the data science industry, it’s that perhaps the more interesting stuff is yet to come. Furthermore, it suggests that the companies (and perhaps individuals) with the best speeds and feeds will not necessarily be the winners.

What Comes Next?

Today, it could be argued that the most profitable “computer” in the world is the iPhone, which, to be sure, has better “speeds and feeds” than any computer from my childhood. But it is by no means the fastest computer available today. Nor does it have the most RAM, the most disk space, or the best graphics. How can that be?

Of course, the focus of computing shifted from desktop to laptop to mobile, in part due to great advances in chip technology and miniaturization. The benefit was not greater speeds and feeds, but rather smaller sizes at the same speeds and feeds. With these smaller, more personal devices, the software and the design of the system took on greater importance. People were not using these devices to “crunch numbers” or do complex but highly specialized tasks. Rather, they were using them for everyday tasks: checking email, surfing the web, and communicating with friends. These were not business machines; they were for the mass market.

Arguably, the emphasis that Apple places on design has made it the most successful computer company of today, because design is what creates the best user experience in the mass market. Data science remains a niche area of work, even though its popularity and application have exploded in just a few years. It’s difficult for me to see how it might move into a mass-market position, but I can see more and more people doing and consuming data analysis in the future. As the population of data analysis consumers grows, I think people will become less focused on accuracy and prediction metrics and more focused on whether a given analysis achieves a specified goal. In other words, data analyses will have to be designed to accomplish a certain task. The better individuals are at designing good data analyses, the more successful they will be.