Successful people solve problems. Look at any great fortune, whether it be Carnegie’s, Ford’s or Gates’s, and you find that at its source was a problem solved. Even more prosaic executives spend most of their time solving one problem or another, with greater or lesser skill.
The contrast in outcomes can be attributed to the scale and difficulty of the problems they tackle. All too often, we get so mired in day-to-day challenges that the bigger issues fall by the wayside, left for another day that never seems to come. That, in the final analysis, is the difference between the mundane and the sublime.
So we should pay special attention to those whose ideas had impact far beyond their own lifespan. It is they who were able to see not only the problems of their day but ones that, although they seemed minor or trivial at the time, would become consequential—even determinant—in years to come. Here are four such men and what we can learn from them.
Vannevar Bush and the Emerging Frontier of Science
By any measure, Vannevar Bush was a man of immense accomplishment. A professor at MIT who built the differential analyzer, one of the first working computers, he also co-founded Raytheon, a $30 billion company that prospers to this day.
Yet even these outsized achievements pale in comparison to how Bush fundamentally changed the relationship between science and society at large. In the late 1930s, as the winds of war began to stir in Europe, Bush saw that the coming conflict would not be won by bullets and bombs alone. Science, he understood, would likely tip the balance between victory and defeat.
It was that insight which led to the establishment of the Office of Scientific Research and Development (OSRD). With Bush at its helm, the agency led the development of the proximity fuze, guided missiles, radar, more advanced battlefield medicine and, not least of all, the Manhattan Project, which produced the atomic bomb.
As the war came to a close, President Roosevelt asked Bush to write a report on how the success of the OSRD could be replicated in peacetime. That report, Science, the Endless Frontier, outlined a new vision of the relationship between public and private investment, with government expanding scientific horizons and industry developing new applications.
He wrote:
Basic research leads to new knowledge. It provides scientific capital. It creates the fund from which the practical applications of knowledge must be drawn. New products and new processes do not appear full-grown. They are founded on new principles and new conceptions, which in turn are painstakingly developed by research in the purest realms of science.
Bush’s report led to the creation of the National Science Foundation (NSF) and helped shape agencies such as the NIH and DARPA, which have funded early research in everything from the Internet and GPS to the Human Genome Project and many of our most important cures. It has been Bush’s vision, perhaps more than anything else, that has made America an exceptional nation.
Oh, and he also wrote an essay in 1945, “As We May Think,” that not only laid out what would become the Internet, but influenced many of the key pioneers who designed it.
Marshall McLuhan and the Global Village
Where Vannevar Bush saw the transformative potential of science, Marshall McLuhan was one of the first to see the subtle but undeniable influence of popular culture. While many at the time dismissed mass media as merely the flotsam and jetsam of the modern age, he saw that the study of things like newspapers, radio and TV could yield important insights.
Central to his ideas about culture was his concept of media as “extensions of man.” Following this line of thought, he argued that Gutenberg’s printing press not only played a role in spreading information but also in shaping human thought. Essentially, the medium is the message. Interestingly, these ideas led him to very much the same place as Bush.
As he wrote in 1962, nearly 30 years before the invention of the World Wide Web:
The next medium, whatever it is—it may be the extension of consciousness—will include television as its content, not as its environment, and will transform television into an art form. A computer as a research and communication instrument could enhance retrieval, obsolesce mass library organization, retrieve the individual’s encyclopedic function and flip into a private line to speedily tailored data of a saleable kind.
McLuhan argued further that the new age of electronic media would disrupt the private experience and specialization that the dominance of printed media had brought about, and would usher in a new era of collective, transnational experience that he called the global village. Anybody who watches global news networks or surfs the Web can see what he meant.
Importantly, however, he did not see the global village as a peaceful place. Rather than promoting widespread harmony and understanding, he predicted that the ability to share experiences across vast chasms of time and space would lead to a new form of tribalism, resulting in a “release of human power and aggressive violence” greater than ever in history.
It has become all too clear what he meant by that as well.
Richard Feynman Sees “Plenty of Room at the Bottom”
When Richard Feynman stepped up to the podium to address the American Physical Society in 1959, he had already gained a reputation as both an accomplished scientist and an iconoclast (during his time on the Manhattan Project, he became famous for his safecracking and other pranks).
His talk, modestly titled There’s Plenty of Room at the Bottom, would launch a revolution in physics and engineering that continues to play out to this day. Starting from a seemingly innocent question about shrinking an encyclopedia down to the size of a postage stamp, he proceeded over the next hour to invent the new field of nanotechnology.
The talk, which is surprisingly easy and fun to read, also gives a fascinating window into how a genius thinks. After pondering the problem of shrinking things down to the size of molecules, he proposes some solutions, then thinks some more about what issues those ideas would create, proposes some more fixes, and so on until a full picture emerges.
One of the most astounding things about Feynman is that his creation of nanotechnology was not a one-off, but part of a larger trend. He was also a pioneer in parallel computing and did important work in virology. All of this in addition to his day job as a physicist, for which he won the Nobel Prize in 1965.
Tim Berners-Lee Creates a Web of Data
Tim Berners-Lee is most famous for his creation of the World Wide Web. In 1990, building on a proposal he had written the year before, he created the three core technologies—HTTP, URL, and HTML—that we now know as the “Web” and released his creation to the world, refusing to patent it. Later, he helped set up the World Wide Web Consortium (W3C), which continues to govern and manage the Web’s growth and further development.
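To make the trio concrete, here is a minimal sketch, using only Python’s standard library, of how the three pieces divide the work: the URL names a resource, HTTP fetches it, and HTML structures what comes back. The address https://example.org/ is a placeholder for illustration, not anything from Berners-Lee’s original system.

```python
# A minimal sketch of how URL, HTTP and HTML divide the work.
# Uses only Python's standard library; https://example.org/ is a placeholder.
from html.parser import HTMLParser
from urllib.request import urlopen


class TitleParser(HTMLParser):
    """Pulls the <title> element out of an HTML document."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data


url = "https://example.org/"            # the URL names the resource
with urlopen(url) as response:          # HTTP fetches it
    html = response.read().decode("utf-8")

parser = TitleParser()                  # HTML structures what came back
parser.feed(html)
print(parser.title)
```

The separation is arguably the design insight: naming, transport and markup could each evolve on their own, which is a large part of why the Web scaled the way it did.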
The truth is, however, that the Web wasn’t a product of any great vision, but rather a solution to a particular problem that he encountered at CERN. Physicists would come there from all over the world, work for a period of time and then leave. Unfortunately, they recorded their work in a labyrinth of different platforms and protocols that didn’t work well together.
So Berners-Lee set out to solve that problem by creating a universal medium that could link information together. He never dreamed it would grow into what it did. If he had, he would have built it differently. He wrote at length about these frustrations in his memoir, Weaving The Web. Chief among them was the fact that while the Web connected people, it did little for data.
So he envisioned a second web, which he called the Semantic Web. Much like his earlier creation, the idea outstripped even what he imagined for it. New technologies, such as Hadoop and Spark, have made data central to how today’s technology functions. Increasingly, we’re living in a semantic economy, where information knows no bounds and everything connects.
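The Semantic Web’s underlying model gives a sense of what “information knows no bounds” means in practice: facts become subject-predicate-object triples that any machine can link and query. Here is a minimal sketch using the third-party rdflib Python package; the example.org URIs and the invented property are made-up placeholders, not part of any published vocabulary.

```python
# A toy Semantic Web graph: facts stored as subject-predicate-object
# triples. Requires the third-party rdflib package (pip install rdflib);
# all example.org URIs are placeholders invented for this sketch.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import FOAF, RDF

EX = Namespace("http://example.org/")  # made-up vocabulary

g = Graph()
tim = URIRef("http://example.org/people/tim")
g.add((tim, RDF.type, FOAF.Person))                  # tim is a person
g.add((tim, FOAF.name, Literal("Tim Berners-Lee")))  # his human-readable name
g.add((tim, EX.invented, URIRef("http://example.org/things/web")))

# Because every fact is self-describing, generic code can answer
# questions nobody anticipated when the triples were written.
for person in g.subjects(RDF.type, FOAF.Person):
    print(person, "->", g.value(person, FOAF.name))
```

Tools like Hadoop and Spark attack the scale of data rather than its meaning, but the direction is the same: the value moves from documents to the connections between facts.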
The Best Way to Predict the Future Is to Create It
Take a hard look at these four visionaries and some common themes emerge. First, all except McLuhan took an active role in bringing their ideas into reality. Bush played a central role in implementing the scientific architecture he designed. Feynman offered prizes for people who could make things at nanoscale, and Berners-Lee continues to take an active role at the W3C.
Another commonality is that, while their ideas didn’t meet with immediate acceptance, they stuck with them. McLuhan’s ideas made him an outcast for much of his career until he became an international celebrity in his fifties. Berners-Lee created the Web partly out of frustration after the hypertext community wouldn’t pursue it. Bush and Feynman met less resistance but were already prominent in their fields.
Probably most importantly, none of them were following trends. Rather, they set out to uncover fundamental forces. It was that quest for basic understanding that led them to ask questions and find answers that nobody else could imagine at the time. They weren’t just looking to solve the problems of their day but sought out problems that transcended time.
In effect, they were able to see the future because they cared about it. Their motivation wasn’t to beat the market, impress a client or attract funding for a startup, but to understand more about how the universe functions and what could be made possible. In doing so, they helped us see it too so that we could also join in and make the world a better place.