Conventional methods are also of limited help in overcoming one of the greatest obstacles to precision medicine: data sharing. The puzzles of precision medicine cannot be solved by a single government agency, life sciences company, or university. Solving them requires what might be called a “megacommunity,” with government, business, and society pooling their data, and often their expertise, in a collaborative effort.
And yet traditional approaches to data cannot provide the security and privacy protections that each stakeholder needs. Once again, precision medicine hits a brick wall.
Still another challenge is that with conventional methods, researchers without specialized training in IT don’t have direct access to the data. To ask questions and get answers back, they must work through data scientists and other IT specialists, a laborious process. But with the explosion of data and analytics in virtually every corner of society, these IT experts are in high demand and short supply. There are simply not enough of them available to help researchers explore many of the most promising avenues of precision medicine.
The ultimate problem is that current computing approaches weren’t built for big data; they were developed when data was “smaller” and more manageable. The traditional methods worked well for many years, but they are not up to the task of addressing big-data challenges like precision medicine.