Designing an airplane has long been an exercise in tradeoffs. A larger airplane with more powerful engines can hold more people and go farther, but is costlier to run. A smaller one is more cost efficient, but lacks capacity. For decades, these have been nearly inviolable constraints that manufacturers just had to live with.
Boeing’s new 787 Dreamliner is different. The company didn’t just redesign the airplane; it redesigned the materials that go into it. By using composite materials that are lighter and stronger than traditional metals, Boeing was able to build an aircraft that is 20% more fuel efficient, yet sacrifices nothing in terms of capacity or performance.
Typically, this has been a game that only a multibillion-dollar corporation can play. Developing and testing new materials costs millions of dollars a year, with no guarantee of any return on investment. Yet now that’s starting to change. Big data and machine learning are revolutionizing the science of making things and will make it available to the masses.
When Cancer Becomes Data
Typically, cancer research has been approached according to the traditional scientific method. A scientist would come up with a theory about how a particular mutation causes a tumor, apply for a grant, perform a study and then publish results. But in 2005, researchers at the National Cancer Institute (NCI) saw an opportunity to go in another direction.
“We said, ‘Let’s gather data along with some basic analysis, publish it and allow the scientific community to study it,’” Jean Claude Zenklusen, a biologist at NCI, told me. “We did this because we believed that by releasing the data in this way, we could tap into the collective expertise of thousands of researchers across a number of fields and accelerate innovation.”
This approach formed the basis for The Cancer Genome Atlas (TCGA), a joint project between NCI and the National Human Genome Research Institute, which began in 2006. It has since sequenced the tumors of over 10,000 patients encompassing 33 types of cancer. “Cancer data has now become open data,” Zenklusen told me proudly.
The success of the program turned traditional science on its head. It showed that you no longer need to spend years testing a hypothesis to do science effectively. There is already an abundance of scientific data gathered from endless experiments and most of it is discarded. New data techniques, however, now allow us to make use of it.
A Materials Genome
When you think about it, a manufacturing process is much like a genome. DNA codes for proteins that determine how stuff in biological organisms gets built. If you have one set of proteins, you might have brown eyes. If you have another, you might have blue. The same goes for our blood type, our cellular processes and virtually every other aspect of us.
The materials that go into manufactured products work in a similar way. The initial components, like metals, chemicals or organic materials, go through a particular process that gives the material certain properties, like density, tensile strength, electrical conductivity and many other things. These properties then determine a product’s performance.
Historically, developing new materials has been a slow and expensive process because while there are countless materials and ways to process them, identifying one that will give you the properties you want is like looking for a very small needle in an almost infinitely large haystack. That’s why only massive companies like Boeing can afford to do continuous materials research.
However, now there is an effort to change that. Much like The Cancer Genome Atlas, the Materials Genome Initiative (MGI) collects data on thousands of materials, so research done to test materials for one purpose can be used to provide insights on how to apply those materials to something else altogether.
For example, scientists at the Joint Center for Energy Storage Research (JCESR) at Argonne National Laboratory are currently testing thousands of compounds in their quest to make next generation batteries. Very few will be of use to them, but by including their work in the MGI data set, others can unlock value from it.
Big Data And The Democratization of Exploratory Research
The idea of collecting information in a central repository is nothing new, but what makes programs like The Cancer Genome Atlas and the Materials Genome Initiative profoundly different is that their data is machine readable. So instead of a human scientist spending years poring through hundreds of tables and charts, a computer can analyze millions in a small fraction of that time.
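To make the point concrete, here is a toy sketch of what “machine readable” buys you. The dataset, material names and property fields below are invented for illustration; they are not drawn from the MGI. The idea is simply that once records are structured data rather than tables in a PDF, a few lines of code can screen thousands of entries in seconds.

```python
# Hypothetical materials records -- structured data a program can query.
records = [
    {"material": "Alloy-A", "density_g_cm3": 2.7, "tensile_mpa": 310},
    {"material": "Alloy-B", "density_g_cm3": 4.5, "tensile_mpa": 900},
    {"material": "Alloy-C", "density_g_cm3": 7.8, "tensile_mpa": 400},
]

def screen(records, max_density, min_strength):
    """Return the names of materials that are both light and strong enough."""
    return [r["material"] for r in records
            if r["density_g_cm3"] <= max_density
            and r["tensile_mpa"] >= min_strength]

# Which candidates are lighter than 5.0 g/cm3 but stronger than 500 MPa?
print(screen(records, max_density=5.0, min_strength=500))  # → ['Alloy-B']
```

The same filter that runs over three records here runs just as easily over a repository of millions, which is what a human reader poring over published tables could never do.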
Citrine Informatics, a company founded by three materials scientists, is taking the next logical step by applying advanced machine learning algorithms to materials data. “We want to do for manufacturing and materials what Microsoft did for offices, enable companies of all sizes to do high quality work, much faster and more productively than they ever thought they could before,” Greg Mulholland, the firm’s CEO, told me.
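In the simplest terms, the kind of learning described above means fitting a model to known structure–property measurements and using it to predict the properties of materials no one has synthesized yet. The sketch below fits a one-variable linear model by least squares; the composition–property numbers are invented for illustration, and real materials informatics uses far richer features and models than this.

```python
# Hypothetical (composition feature x, measured property y) pairs.
data = [(0.1, 1.2), (0.2, 1.9), (0.3, 3.1), (0.4, 4.0), (0.5, 5.2)]

# Ordinary least-squares fit of y = slope * x + intercept.
n = len(data)
mean_x = sum(x for x, _ in data) / n
mean_y = sum(y for _, y in data) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in data)
         / sum((x - mean_x) ** 2 for x, _ in data))
intercept = mean_y - slope * mean_x

def predict(x):
    """Estimate the property for a composition not yet tested."""
    return slope * x + intercept

print(round(predict(0.6), 2))  # → 6.11
```

A model like this lets you rank untested candidates before spending money in the lab, which is the core economic promise of data-driven materials research.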
A similar revolution is underway in genomics. As researchers learn more about the genetic code and put that data online, a new technique, called CRISPR, is enabling even amateur scientists to edit genes at will. The procedure, discovered in 2012, has already resulted in a number of breakthroughs.
What’s essential to understand about these new developments is that you don’t need a multimillion dollar lab to make use of them. Today, just about anybody can access this knowledge and, with machine learning algorithms becoming increasingly ubiquitous, will be able to create enormous value with it.
The Mindset Shift
Much like the Internet democratized knowledge — a teenager with a smartphone today has more access to information than a specialist at a large institution a generation ago — big data and machine learning will bring scientific understanding to the masses. In the years to come, that will revolutionize how we make things, especially for small and medium sized firms.
Up till now, to come up with a breakthrough you needed to use the scientific method of hypothesis, experimentation and verification, which, although effective, is incredibly tedious and costly. Now, however, with experimental results increasingly available to anyone with an Internet connection, that’s beginning to change.
Today, there are a number of ways for even small companies to access world class research, from government programs to industry consortiums to collaborations with local universities and, much like Boeing, they can use that knowledge to create revolutionary products. In fact, it is likely that anybody who doesn’t will find it difficult to compete in the future.
However, every shift in capabilities requires a shift in mindset. We can no longer think about the process of discovery in the traditional way in which we meticulously plod along until we hit on something big. Rather, we need to think about exploration itself as a competitive advantage. Not all who wander are lost. The trick is to learn to wander with purpose.
An earlier version of this article first appeared on Inc.com.
Greg Satell is a popular speaker and consultant. His first book, Mapping Innovation: A Playbook for Navigating a Disruptive Age, was selected as one of the best business books of 2017. Follow his blog at Digital Tonto or on Twitter @DigitalTonto.