Regulating Genomic Research: Top-Down Or Bottom-Up?

Commentary by John R. Graham

Source: Forbes

The next frontier in information technology is genomic sequencing, which will create the biggest of big data resources by 2025, according to experts in the field. It has been 15 years since President Clinton announced the first sequencing of the human genome, and it is now clear that researchers’ ability to unlock the unimaginable wealth of information inside our genomes is bumping up against constraints imposed largely by the federal government.

This is the topic of an impressive white paper co-published by the Health IT Now Coalition and the Center for Data Innovation at the Information Technology & Innovation Foundation. From Evolution to Revolution: Building the 21st Century Genomic Infrastructure clearly identifies the problems and describes a path forward – all in seven pages accessible to the non-scientific reader. The paper supports President Obama’s goal of having at least one million volunteers contribute their genetic data to a large pool for research purposes, as stated in the president’s recently announced precision medicine initiative.

This review summarizes the recommendations and suggests a possible way to push the boat out a little farther, by challenging a closely held assumption about the role of the federal government in regulating medical research.

The three recommendations are:

  1. Improve interoperability and data sharing by strengthening federal requirements for health data and “bottom-up, patient-driven” reforms that give patients more control over sharing their health data.
  2. Engage patients in the dialogue about genomic research.
  3. Re-think privacy law, especially the Health Insurance Portability and Accountability Act (HIPAA) and the Common Rule governing research on human subjects, particularly the definition of “informed consent.” Both laws date back to the 1990s and are long overdue for updating.

The second recommendation is not really debatable. The overlap between the first and third is, I believe, so large that they can be discussed as parts of the same problem. At a panel discussion held in Washington, DC on July 23 to launch the white paper, academic researchers discussed the constraints on research imposed by the Common Rule.

According to the panel, Institutional Review Boards (which approve research involving human subjects) are wary of approving consent forms that ask subjects to allow data to be used for more purposes than those defined in advance for one specific project, fearing that such consent would not be seen by authorities as truly informed. This is not good enough for genomic research, where databases can be investigated again and again to answer new research questions. The paper invites Congress to revisit the Common Rule and HIPAA, even suggesting (but not insisting upon) a stretch goal of a universal consent form.

Similarly, the white paper invites Congress to impose stronger requirements on the interoperability of current health data. Interoperability is happening within some institutions. A decade ago, the University of Pennsylvania Health System began to invite patients to contribute their data to its genome bank. Research is facilitated by all physicians in the system using the same Electronic Medical Record (EMR). However, the lack of interoperability across – and even within – many systems has hindered researchers’ ability to exchange data.

The white paper suggests that Congress increase federal requirements on regulated institutions and researchers. For my money, the “bottom-up” approach is more likely to succeed. Indeed, revisiting relevant federal laws should include questioning the hitherto unchallenged assumption that the federal government is the appropriate locus of regulatory power. Progress may be quicker if Congress allows states to take the lead.

Take, for example, interoperability. The president’s budget includes $5 million to support the development of interoperability standards and requirements. Unfortunately, the federal government has already burned through almost $30 billion in the last five years to induce hospitals and doctors to implement EMRs that are interoperable. It is widely accepted that this money was wasted and has actually hindered interoperability. Another $5 million is unlikely to drag the federal mandate across the finish line.

However, there are success stories within states. Take, for example, an effort described by Toby Bloom, PhD, of the New York Genome Center. Under the auspices of one Institutional Review Board, six New York hospitals and physician practices have agreed to pool de-identified patient data for research purposes. The collaboration can remove duplicate records and hopes to merge genomic data into the database. It currently holds data on five million patients and will eventually hold ten million patients’ records, according to Dr. Bloom. That would amount to half the state’s population – far in excess of President Obama’s goal of one million!

If the state were the locus of oversight, researchers, patients and other interested parties might update the regulatory apparatus faster than waiting for Congress to act.

Investors’ Note: Intel (NASDAQ: INTC), Qualcomm (NASDAQ: QCOM), athenaHealth (NASDAQ: ATHN), and McKesson (NYSE: MCK) are among the members of the groups that sponsored the white paper.

John R. Graham is a Senior Fellow at the National Center for Policy Analysis and Co-Organizer of the Health Technology Forum: DC. His articles are collected at JRG Health Sector Analysis.