Foundation Medicine and the Big Barrier to Cancer Genomic Sequencing

    According to Dr. Eric Topol, Director, Scripps Translational Science Institute, recent studies have highlighted the potential value of whole genome or exome sequencing to precisely guide therapy for patients with cancer. However, almost all samples today go into formalin-fixed, paraffin-embedded (FFPE) blocks, which alters the DNA and makes sequencing quite compromised and difficult.

    He told Medscape Connect that Foundation Medicine works with FFPE blocks, sequencing about 250-300 genes (the exons, or coding elements, in those genes) and reading out any potential links to drugs. But the rate-limiting step appears to be getting something beyond these paraffin blocks. That is, we could do better if we could use either fresh formalin-fixed or frozen tissue samples from a biopsy or surgical specimen.

    Topol says the problem is that pathologists are seemingly quite ritualistic. They don't want to go to frozen samples, which would be the best for whole genome sequencing. We're just at the cusp of getting started with this type of limited sequencing (not even full exome sequencing, just a few hundred genes), but that isn't enough.

    Recent papers in Nature, Science, Nature Genetics and Cell have shown that, with hundreds of tumor samples fully sequenced, no two cancers are the same, and a lot of the action is not in the coding elements of the genes per se. Whole genome sequencing certainly appears to be an ideal path to pursue, but we can't do it with the fixation problems we have with the way samples are handled today.

    Topol thinks that maybe we could get fresh formalin-fixed samples, as those appear to be well-suited to whole genome sequencing, although this is still a somewhat bootstrapped situation, like the paraffin-embedded samples. It appears that the longer those samples are embedded, the harder it is to get a reasonable sequence beyond very targeted regions.

    No two cancer tissues are the same on a molecular basis. There's quite a bit of heterogeneity within samples, and sequencing multiple regions could account for that. We also want to anticipate recurrence, distinguish the right driver mutations from the backseat passenger mutations, and determine whether immunotherapy is needed; all those things could be done if we could get the right information from the get-go.

    So Dr. Topol asks this: How are we going to move to a world with a clinic of the future, where patients with cancer can get whole genome sequencing rapidly? That is, to have annotation and interpretation of the genome within a day, and have your therapy precisely guided genomically?
    Gregory D. Pawelski

    #2
    Gene Sequencing for Drug Selection?

    Researchers have realized that cancer biology is driven by signaling pathways. Cells speak to each other and the messages they send are interpreted via intracellular pathways known as signal transduction. Many of these pathways are activated or deactivated by phosphorylations on select cellular proteins.

    Sequencing the genome of cancer cells is explicitly based upon the assumption that the pathways - network of genes - of tumor cells can be known in sufficient detail to control cancer. Each cancer cell can be different and the cancer cells that are present change and evolve with time.

    Although the theory behind inhibitor targeted therapy is appealing, the reality is more complex. Cancer cells often have many mutations in many different pathways, so even if one route is shut down by a targeted treatment, the cancer cell may be able to use other routes.

    In other words, cancer cells have "backup systems" that allow them to survive. The result is that the drug does not affect the tumor as expected. The cancer state is typically characterized by a signaling process that is unregulated and in a continuous state of activation.

    In chemotherapy selection, molecular profiling examines a single process within the cell or a relatively small number of processes. All a gene mutation study can tell is whether or not the cells are potentially susceptible to a mechanism of attack. The aim is to tell if there is a theoretical predisposition to drug response.

    It doesn't tell you the effectiveness of one drug (or combination) versus any other drug that may target the same mechanism in the individual. There are many pathways to altered cellular function. Functional Profiling measures the end result of pathway activation or deactivation to predict whether patients will actually respond (clinical responders).

    It measures what happens at the end, rather than the status of the individual pathway, by assessing the activity of a drug (or combinations) upon the combined effect of all cellular processes, using combined metabolic and morphologic endpoints at the cell population level, measuring the interaction of the entire genome.

    Translational science: past, present, and future

    Note: Foundation Medicine is not any different from Caris Diagnostics in Phoenix (now Miraca Life Sciences) beyond testing for standard pathology "targets" such as ER, PR, Her2, EGFR mutations, KRAS and BRAF. These tests aren't worth much for the sorts of chemotherapy used in 95% of all cancers, and they are useless with respect to drug combinations. While fresh tissue is very dear and hard to come by, function trumps structure: it provides more potent and robust information than archival paraffin blocks.
    Gregory D. Pawelski

      #3
      Targetable genetic alterations detected by next-generation sequencing

      Needle biopsies were assayed from 33 pretherapy primaries and 17 mBCs, with NGS of 3,230 exons in 182 cancer-related genes and 37 introns in 14 genes. A total of 117 "potentially" targetable driver mutations were identified (mean: 2.3 in primary tumors, 2.8 in mBCs); however, further research is needed.

      According to Dr. Neil Love of Research To Practice, this was one of several presentations at the ASCO trade show, across a variety of solid tumors, on the now commercially available FoundationOne assay. This study of 50 patients with breast cancer — like similar reports in lung, prostate and colorectal cancer — documented that fine needle biopsies provided enough tissue to adequately perform the test. Even more relevant was that multiple potentially “targetable” mutations were found in all these specimens. Although the authors suggest that some of these targets may correlate with benefit from approved agents (eg, crizotinib for ALK translocations), and it might be tempting to consider ordering the assay for patients who have run out of conventional options, this concept has not yet been tested. In that regard, it is worth reflecting on the wholly disappointing experience with BRAF inhibitors for patients with V600E mutation-positive colorectal cancer and appreciating that the term “targetable” is highly theoretical at this point in time.

      Use of next-generation sequencing (NGS) to detect high frequency of targetable alterations in primary and metastatic breast cancer (MBC).

      Sub-category:
      Genomic and Epigenomic Biomarkers

      Category:
      Tumor Biology

      Meeting:
      2012 ASCO Annual Meeting

      Session Type and Session Title:
      General Poster Session, Tumor Biology

      Abstract No:
      10559

      Citation:
      J Clin Oncol 30, 2012 (suppl; abstr 10559)

      Author(s):

      Lajos Pusztai, Roman Yelensky, Bailiang Wang, Rony Avritscher, William Fraser Symmans, Doron Lipson, Gary A. Palmer, Stacy L. Moulder, Philip Stephens, Yun Wu, Maureen T. Cronin; University of Texas M. D. Anderson Cancer Center, Houston, TX; Foundation Medicine, Cambridge, MA; University of Texas, Houston, TX; Department of Radiology, University of Texas M. D. Anderson Cancer Center, Houston, TX

      Abstract:

      Background:

      The aim of this study was to assess the frequency of genomic alterations in breast cancer potentially treatable with approved targeted agents or investigational drugs in clinical trials. NGS was performed in a CLIA setting (Foundation Medicine).

      Methods:

      DNA was extracted from needle biopsies of 33 pre-therapy primary and 17 MBCs (mean age 52 yrs; 58% ER+, 20% HER2+, 30% triple negative) obtained prospectively for biomarker discovery and preserved in RNAlater. Patients with MBC received an average of 7 drugs (range 5-17) including adjuvant therapy before biopsy for this research; 13 biopsies were from soft tissues, 3 from liver and 1 from bone. Sequencing was targeted to 3230 exons in 182 cancer-related genes and 37 introns in 14 genes often rearranged in cancer. Average median depth was >1200x.

      Results:

      All biopsies yielded sufficient DNA. NGS revealed a total of 117 known driver mutations across 36 genes (per-tumor average=2.5, range 1-6), including 37 base substitutions (32%), 28 indels (24%), 42 amplifications (36%) and 10 homozygous deletions (9%). NGS identified HER2 gene amplification in 6/7 cases scored HER2+ by FISH. The average number of functionally important alterations was surprisingly similar, 2.3 in primaries vs 2.8 in heavily treated MBCs (p=0.32). Remarkably, 25/33 (76%) of primary and 14/17 (82%) of MBCs had at least 1 genomic alteration targetable with an FDA approved drug or novel agent in clinical trials. These included: ERBB2 alterations (n=9), PIK3CA mutations (n=8), NF1 mutations (n=4, candidate for PI3K/MEK inhibitors), AKT1-3 mutations (n=5, PI3K inhibitors), BRCA1/2, (n=6, PARP inhibitors), and CCND2 (n=3)/CDKN2A (n=3) mutations (CDK inhibitors). Numerous other alterations with less apparent therapeutic implications were also observed.

      Conclusions:

      Comprehensive NGS profiling in breast cancer needle biopsies showed high frequency of genomic alterations linked to a clinical treatment option or clinical trials of targeted therapies. These results demonstrate it is feasible to use NGS to guide targeted therapy. Prospective testing of the diagnostic/predictive value of this patient selection approach is currently under way.
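
      As a quick check on the arithmetic in the Results section, the reported counts and percentages can be recomputed in a few lines of Python. This is a minimal sketch: the counts are copied from the abstract above, everything else is derived.

[CODE]
# Recompute the mutation breakdown reported in the abstract.
counts = {
    "base substitutions": 37,
    "indels": 28,
    "amplifications": 42,
    "homozygous deletions": 10,
}

total = sum(counts.values())
assert total == 117  # "a total of 117 known driver mutations"

for kind, n in counts.items():
    print(f"{kind}: {n} ({n / total:.0%})")  # 32%, 24%, 36%, 9%

# The per-group means (2.3 across 33 primaries, 2.8 across 17 MBCs)
# also reconcile with the reported per-tumor average of 2.5:
overall = (2.3 * 33 + 2.8 * 17) / 50
print(f"overall mean alterations per tumor: {overall:.2f}")  # ~2.47
[/CODE]
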
      Gregory D. Pawelski

        #4
        Accuracy and Clinical Utility of In Vitro Cytometric Profiling

        Accuracy and clinical utility of in vitro cytometric profiling to personalize chemotherapy: Preliminary findings of a systematic review and meta-analysis.

        Sub-category: Molecular Diagnostics and Imaging

        Category: Tumor Biology

        Meeting: 2013 ASCO Annual Meeting

        Abstract No: e22188

        Citation: J Clin Oncol 31, 2013 (suppl; abstr e22188)

        Author(s): Christian Apfel, Kimberly Souza, Cyrill Hornuss, Larry Weisenthal, Robert Alan Nagourney; SageMedic, Inc, Larkspur, CA; Ludwig Maximilians University of Munich, Munich, Germany; Weisenthal Cancer Group, Huntington Beach, CA; Rational Therapeutics, Long Beach, CA

        Abstract:

        Background:

        Cytometric analysis, or in-vitro functional profiling, has been developed as a method to predict tumor response to different drugs, with the aim of personalizing chemotherapy and improving patient outcomes.

        Methods:

        We performed a systematic review and a meta-analysis a) of correlative studies using cytometric profiling that reported diagnostic accuracy (sensitivity and specificity) and b) of effectiveness studies comparing patient outcomes when allocated to treatment guided by a cytometric assay versus population-based standard of care. We used Meta-DiSc software to find pooled sensitivity and specificity and analyze the summary receiver operating characteristic (sROC) curve and used Review Manager 5.1 to generate forest plots on overall tumor response (50% or greater decrease in tumor diameter) and on 1-year overall survival.

        Results:

        We included 28 mostly retrospective trials (n=664) reporting accuracy data and 15 prospective trials (n=1917) reporting therapeutic efficacy data. The accuracy analysis of the correlative studies revealed an overall sensitivity of 0.922 (95% confidence interval 0.888 to 0.948), specificity of 0.724 (95% CI 0.669 to 0.774) and an area under the sROC curve of 0.893 (SE=0.023, p<0.001). Studies comparing clinical utility revealed a two-fold higher overall tumor response for assay-guided therapy versus standard-of-care therapy (odds ratio 2.04, 95% CI 1.62 to 2.57, p<0.001). Similarly, patients who received assay-guided therapy compared to those who received standard of care or physician’s choice had a significantly higher 1-year survival rate (OR 1.44, 95% CI 1.06 to 1.95, p=0.02).

        Conclusions:

        Despite various limitations of individual studies, the aggregate and fairly consistent evidence of these data suggests cytometric profiling to be accurate, to improve overall tumor response, and to increase 1-year patient survival. Given the enormous potential for our society, a well-designed and sufficiently-powered randomized controlled trial is urgently needed to validate these results.
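
        To make the statistics concrete, here is a minimal Python sketch of the two computations the Methods describe: simple pooling of sensitivity and specificity across 2x2 tables, and an odds ratio with a 95% confidence interval on the log scale. The study counts below are hypothetical placeholders, and Meta-DiSc's actual random-effects and sROC models are more involved than this fixed-pool illustration.

[CODE]
import math

# Two hypothetical correlative studies, each reduced to a 2x2 table of
# assay prediction (sensitive/resistant) vs. observed tumor response.
studies = [
    {"tp": 40, "fn": 4, "fp": 10, "tn": 26},
    {"tp": 55, "fn": 5, "fp": 12, "tn": 30},
]

tp = sum(s["tp"] for s in studies)
fn = sum(s["fn"] for s in studies)
fp = sum(s["fp"] for s in studies)
tn = sum(s["tn"] for s in studies)

print(f"pooled sensitivity: {tp / (tp + fn):.3f}")
print(f"pooled specificity: {tn / (tn + fp):.3f}")

# Odds ratio with a 95% CI on the log scale, as used in the forest plots.
log_or = math.log((tp * tn) / (fp * fn))
se = math.sqrt(1 / tp + 1 / fn + 1 / fp + 1 / tn)
lo, hi = math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se)
print(f"OR {math.exp(log_or):.2f} (95% CI {lo:.2f} to {hi:.2f})")
[/CODE]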

        Gregory D. Pawelski

          #5
          Venture Capital Goes Genomic

          Robert A. Nagourney, M.D.

          During the 1960s, 70s and into the 90s, a field of investigation arose that examined buyer’s practices when it came to the consumption of goods and services. Algorithms were developed to interrogate consumer choice. One such treatise was reported in 1994 (Carson, RT et al, Experimental Analysis of Choice, Marketing Letters 1994). What these researchers explored were the motivations and forces that drove consumption. When choices are offered, decisions are driven by such factors as complexity and utility. Complexity demands personal expertise or failing that, input from experts, while utility places a value on the good or service.

          A recent report from a small biotechnology company called Foundation Medicine has brought this field of endeavor to mind. It seems that this group will be offering DNA sequencing to select chemotherapy drugs. This service, currently priced at $5,800, will focus upon a small cassette of genes that they described as “key” in tumor growth. Based on their technology they have already raised $33.5 million from the likes of Third Rock, Google and Kleiner Perkins Caufield & Byers, venture capital sources. The CEO of Foundation substantiates the approach by pointing out that fully 150 people have already used their services. One hundred and fifty!

          It seems from this report that our colleagues in the field of molecular profiling have studied the dictates of “Experimental Analysis of Choice” to a “T.” What we have is the perfect storm of medical marketing.

          First, the technology is so complex as to be beyond the ken of both patients and physicians alike. Thus, expertise is required and that expertise is provided by those engaged in the field. Second, the utility of drug selection is beyond reproach. Who in their right mind wouldn’t want to receive a drug with a higher likelihood of a response when we consider the toxicities and costs, as well as the consequences of the wrong treatment? Dazzled by the prospect of curative outcomes, patients will, no doubt, be lining up around the block.

          But, let’s deconstruct what this report is actually telling us. First, a scientifically interesting technology has been brought to the market. Second, it exists to meet an unmet need. So far, so good. What is lacking, however, is evidence. Not necessarily evidence in the rarefied Cochrane sense of idealized survival curves, nor even Level II evidence, but any evidence at all. Like whirling dervishes, patients and their physicians are drawn into a trancelike state, when terms like NextGen sequencing, SNP analysis and splice variants are bandied about.

          Despite the enthusiastic reception by investors, I fear a lack of competent due diligence. To wit, a recent article in Biotechniques, “Will the Real Cancer Cell Please Stand Up,” comes to mind. It seems that cancer cells are not individual entities but networks. A harmonic oscillation develops between tumor, stroma, vasculature and cytokines. In this mix, the cancer cell is but one piece of the puzzle.

          Indeed, according to recent work from Baylor, some of the tumor promotion signals, in the form of small interfering RNAs, may arise not from the cancer cells but instead from the surrounding stroma. How then will even the most punctiliously perfect genomic analyses of cancer cells play out in the real world of human tumor biology and clinical response prediction? Not very well, I fear. But then again, such a discussion would require data on the predictive validity of the method, something that appears to be sorely lacking.

          Will today’s gene profile companies prove to be the biotech Facebook IPOs of tomorrow?
          Gregory D. Pawelski

            #6
            There is also the issue of cell-lines vs fresh cells. Cell-lines have always played, and continue to play, an important role in drug screening and drug development.

            The problem is that cell-lines do not predict for disease or patient specific drug effects. If you can kill cancer cell-lines with a given drug, it doesn’t tell you anything about how the drug will work in real world, clinical cancer (real-world conditions). But you can learn certain things about general drug biology through the study of cell-lines.

            As a general rule, studies from established cell-lines (tumor cells that are cultured and manipulated so that they continue to divide) have proved worthless as models to predict the activity of drugs in cancer. They are more misleading than helpful. An established cell-line is not reflective of the behavior of fresh tumor samples (live samples derived from tumors) in primary culture, much less in the patient.

            Established cell-lines have been a huge disappointment over the decades with respect to their ability to correctly model the disease-specific activity of new drugs. What works in cell-lines does not often translate into human beings. You get different results when you test passaged cells compared to primary, fresh tumor.

            Research on cell-lines is cheap compared to clinical trials on humans. But one gets more accurate information when using intact RNA isolated from “fresh” tissue than when using the degraded RNA present in paraffin-embedded tissue.

            My question would be, do you want to utilize your tissue specimen for “drug selection” against “your” individual cancer cells or for mutation identification, to see if you are “potentially” susceptible to a certain mechanism of attack?

            Cell Lines vs Fresh Cells

            Gregory D. Pawelski

              #7
              Is Genomic Sequencing Ready for Prime Time in Drug Selection?

              Next-generation sequencing (NGS) technologies have come a long way since 1977 when Frederick Sanger developed chain-termination sequencing, but are they ready for prime time in drug selection?

              Researchers have realized that cancer biology is driven by signaling pathways. Cells speak to each other and the messages they send are interpreted via intracellular pathways known as signal transduction. Many of these pathways are activated or deactivated by phosphorylations on select cellular proteins.

              Sequencing the genome of cancer cells is explicitly based upon the assumption that the pathways - network of genes - of tumor cells can be known in sufficient detail to control cancer. Each cancer cell can be different and the cancer cells that are present change and evolve with time.

              Although the theory behind inhibitor targeted therapy is appealing, the reality is more complex. Cancer cells often have many mutations in many different pathways, so even if one route is shut down by a targeted treatment, the cancer cell may be able to use other routes.

              In other words, cancer cells have "backup systems" that allow them to survive. The result is that the drug does not affect the tumor as expected. The cancer state is typically characterized by a signaling process that is unregulated and in a continuous state of activation.

              In chemotherapy selection, genotype analysis (genomic profiling) examines a single process within the cell or a relatively small number of processes. All a gene mutation study can tell is whether or not the cells are potentially susceptible to a mechanism of attack. The aim is to tell if there is a theoretical predisposition to drug response.

              It doesn't tell you the effectiveness of one drug (or combination) versus any other drug that may target the same mechanism in the individual. There are many pathways to altered cellular function. Phenotype analysis (functional profiling) measures the end result of pathway activation or deactivation to predict whether patients will actually respond (clinical responders).

              It measures what happens at the end, rather than the status of the individual pathway, by assessing the activity of a drug (or combinations) upon the combined effect of all cellular processes, using combined metabolic and morphologic endpoints at the cell population level, measuring the interaction of the entire genome.

              Should oncologists begin using deep genome sequencing in their clinical practice? At the annual meeting of the European Society for Medical Oncology, two key opinion leaders battled it out over this topic in a debate.
              Gregory D. Pawelski

                #8
                Debating Next-Generation Deep Sequencing

                At ESMO, experts assess the clinical use of genome sequencing

                Vienna—Should oncologists begin using deep genome sequencing in their clinical practice? Next-generation sequencing (NGS) technologies have come a long way since 1977 when Frederick Sanger developed chain-termination sequencing, but are they ready for prime time? At the annual meeting of the European Society for Medical Oncology, two key opinion leaders battled it out over this topic in a debate.

                The Argument for Deep Genome Sequencing

                Arguing the pro position, Fabrice Andre, MD, PhD, of the Institut Gustave Roussy in Villejuif, France, said that embracing deep sequencing in daily clinical practice is not only the right thing to do, it is a necessity. The number of genetic biomarkers known to influence patient outcomes and care has risen dramatically in recent years and is only expected to grow, he said.

                “The current system is not sustainable for hospitals and academic centers,” said Dr. Andre. “It’s not possible for [them] to run more than 10 bioassays per patient. We need to move to multiplex technology.”

                For breast cancer, he said, clinicians can run tests for ER/HER2, TOP2A, FGFR1, IGF1R, EGFR, PAK1, BRCA1, CYP2D6, PTEN and PIK3CA, among others. With whole genome sequencing, “you can assess all the genes that you want,” said Dr. Andre. “When you do one test for each biomarker, each biomarker has a cost. Keep in mind that three FISH [fluorescence in situ hybridization] tests equal the cost of one whole genome CGH [comparative genomic hybridization] array.”
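
                Dr. Andre's cost argument reduces to a simple break-even calculation; here is a toy Python sketch. All prices are hypothetical placeholders, not figures from the talk.

[CODE]
# Toy cost model: single-biomarker assays scale linearly with the number
# of markers tested; one multiplex panel is a flat cost. Prices below are
# hypothetical placeholders, not figures from Dr. Andre's talk.
COST_PER_SINGLE_ASSAY = 500    # e.g. one FISH or IHC test
COST_MULTIPLEX_PANEL = 1500    # one whole-genome CGH/NGS run

biomarkers = ["ER/HER2", "TOP2A", "FGFR1", "IGF1R", "EGFR",
              "PAK1", "BRCA1", "CYP2D6", "PTEN", "PIK3CA"]

n = len(biomarkers)
print(f"{n} single assays: {COST_PER_SINGLE_ASSAY * n}")
print(f"one multiplex panel: {COST_MULTIPLEX_PANEL}")

# Break-even at (panel cost / per-assay cost) markers -- 3 here, echoing
# the "three FISH equals one whole genome CGH array" comparison.
print(f"break-even: {COST_MULTIPLEX_PANEL / COST_PER_SINGLE_ASSAY:.0f} markers")
[/CODE]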

                Whole genome sequencing also offers a number of other potential advantages. High throughput approaches can identify a large number of rare targetable gene alterations. This is increasingly important as researchers find genetic alterations that exist in 1% or 2% of patients. The technology also can capture minority clones that may be hard to identify when there is a low percentage of tumor cells in a sample. The next-generation sequencers have been proven to be accurate and they do not need large samples of tissue. Dr. Andre pointed out that some protein-based assays, which are used because they are less expensive than FISH, are not reliable. One study found that the immunohistochemistry test for the HER2 protein was accurate only 81.6% of the time (J Clin Oncol 2006;24:3032-3038, PMID: 16809727).

                The “robust” deep sequencing technology is already being used for patient care at academic centers. One such example is the MOSCATO trial, which began in fall 2011. This trial enrolled 120 patients with difficult-to-treat cancers and is using whole genome sequencing to identify potential therapeutic targets. Once a target has been identified, patients receive targeted therapy in a clinical trial if one is available. The turnaround time for sequencing is 15 days and the total cost is 1,500 euros, or roughly $2,000, per patient.

                The cost of the technology is expected to decrease dramatically in the next few years. By the end of 2012, Oxford Nanopore Technologies is expected to launch a device the size of a USB drive that will offer whole genome sequencing in 15 minutes for less than $1,000. Dr. Andre argued that deep sequencing will be less expensive than a multiplicity of tests.

                He pointed to a case study recently described in the Journal of Thoracic Oncology as an example of a success story (2012;7:e14-e16). In the case report, a 43-year-old never smoker with lung cancer had tested negative for EML4-ALK on the approved companion genetic test for crizotinib (Xalkori, Pfizer). Sensing that an oncogenic genetic driver was spurring the patient’s cancer, clinicians ordered deep sequencing and identified a novel ALK fusion. The patient was treated with crizotinib and was recently reported to have had a complete response.

                “In the context of prospective cohorts, but not clinical trials, I think we need to deliver NGS in order to detect a high number of rare, relevant genomic alterations and then treatment can be done in the context of Phase I trials or drug access programs,” said Dr. Andre.

                The Argument Against Deep Genome Sequencing

                According to Kenneth O’Byrne, MD, a consultant medical oncologist at St. James Hospital and Trinity College Dublin, Ireland, Dr. Andre is jumping the gun. “He makes the fundamental error that all people who are enthusiastic about new technologies always make and that is the non-application of evidence-based medicine,” Dr. O’Byrne said. “Deep sequencing is a fantastic tool, but it is a research toy and an expensive toy at the moment. For day-to-day practical medicine, we have to go by evidence base.”

                Dr. O’Byrne cast doubt on Dr. Andre’s success story example. “They treated the patient with crizotinib and made the false conclusion that the ALK rearrangement they detected was responsible for the response. Do we know if that patient expressed MET? Is there any other reason [he] may have responded to crizotinib?” Dr. O’Byrne said.

                He agreed that the cost of the sequencing technology was decreasing, but argued that analysis would remain expensive. He argued that the clinical benefit of identifying genetic drivers is still uncertain.

                “I would argue that in lung cancer, and indeed in almost every other tumor, there are only a few proven genetic alterations that can be identified that actually affect the way we treat our patients in clinic,” Dr. O’Byrne said. “EGFR [epidermal growth factor receptor] mutations and ALK rearrangements are the only validated predictive biomarkers in NSCLC [non-small cell lung cancer].” He pointed out that these affect only 15% of lung cancer patients, and although there are targeted agents available, the jury is still out on whether the drugs that target these mutations improve survival.

                As an example of this, he pointed out that an interim analysis of the PROFILE 1007 trial presented at the ESMO meeting (abstract LBA1) showed that although crizotinib increased progression-free survival by 4.7 months compared with chemotherapy, there was no difference in overall survival. “If you look at all of the EGFR TKI [tyrosine kinase inhibitor] randomized controlled trials versus cytotoxic chemotherapy in EGFR mutation–positive disease, there has yet to be a proven [overall] survival benefit, despite obvious clinical benefits,” Dr. O’Byrne said. Researchers say the lack of overall survival advantage in many of these trials can be blamed on the large numbers of patients who cross over to the experimental therapy. “The argument is crossover, but we don’t know that yet,” he said.

                Dr. O’Byrne urged caution: several years ago, it was thought that tumor angiogenesis inhibitors would be the salvation of lung cancer patients, and that did not happen. There was clear evidence that new blood tumor vessels were associated with poor outcome, but when researchers tested a slew of antiangiogenic TKIs in patients with lung cancer, none of them worked. These included apatinib, axitinib, cediranib, motesanib, pazopanib, sorafenib, sunitinib and vandetanib. “There is still some promise that some of these might break through,” Dr. O’Byrne said, pointing to Boehringer Ingelheim’s BIBF1120. “But to date, we’ve spent billions of euros proving that many of these are of no value.

                “In my view, and I feel this quite strongly, predictive biomarker tests must undergo validation and quality assurance before they are used routinely in clinical practice,” Dr. O’Byrne said. “Deep DNA sequencing holds huge promise … but it is a research tool, and I do genuinely believe that a lot of clinically irrelevant data is generated that actually confuses the clinician and the patient.”

                Clinical Oncology News Issue: December 2012 | Volume: 07:12
                Gregory D. Pawelski

                  #9
                  Genome-wide sequencing in cancer: not ready for prime time

                  (Reuters Health) - Routine genome-wide screening of cancers is likely a long way off, a new paper says.

                  The technology, known as next-generation sequencing, promises to revolutionize doctors' understanding of cancer and underpins perhaps the biggest paradigm shift taking place in cancer research today: the growing emphasis on a cancer's genetic makeup, rather than its location within the body.

                  Understanding the genetic makeup of an individual patient's tumor may allow physicians to pick the drug that best targets that specific tumor, as well as recognize when a tumor has developed resistance to the drug through new genetic mutations.

                  "Next-generation sequencing is especially promising in cancer because in a single test, one can interrogate all clinically relevant cancer genes for all types of genomic alterations, including sequence mutations and chromosomal rearrangements," Dr. Michael Berger, a geneticist at Memorial Sloan-Kettering Cancer Center in New York City, told Reuters Health in an email.

                  There are several different specific screening technologies that are considered "next-generation" - but all share the ability to sequence entire human genomes in a matter of days. When applied to cancer, the technology is used to screen the entire genome of cancer cells.

                  By some measures, this promise is already being realized. For example, last year The Cancer Genome Atlas Research Network used genome-wide screening of breast cancer tumors to demonstrate that there are four main breast cancer types defined by differing genomic and epigenetic mutations. The study showed that individual breast cancers have many genetic differences from each other but that one subgroup of breast cancers, basal-like breast cancer, was similar genetically to serous ovarian cancer.

                  Cancer cells present unique and complex challenges, the paper's authors note. Because they are genetically so different from normal human tissue, there is not always a 'reference sequence' against which to compare the tumor DNA. There are also frequent chromosome-scale as well as epigenetic changes, and even significant genetic differences among cells within the same tumor, an issue specific to cancer cells known as tumor heterogeneity.

                  The authors of the new paper, writing online July 25 in the British Journal of Cancer, pointed out that this complexity creates a number of problems that must be solved before next-generation sequencing is a common part of cancer care.

                  One of the first issues is developing the algorithms that are used to map the genome.

                  "The computational challenges involved in analyzing and storing clinical (next-generation sequencing) data cannot be overstated," said Dr. Berger, who wasn't involved in the new study. "Better algorithms must be developed to reliably and accurately detect mutations in heterogeneous tumors."

                  In genome-wide sequencing, a seemingly minuscule misstep in the analysis could have massive consequences. For example, say the authors of the new paper, led by Dr. Danny Ulahannan of the Wellcome Trust Center for Human Genetics in Oxford, UK, "the sheer quantity of data means that getting 0.01% of the human genome wrong would correspond to 300,000 errors scattered along the three billion base pairs."

                  Dr. Lynda Chin, the chair of Genomic Medicine and scientific director of the Institute for Applied Cancer Science at MD Anderson Cancer Center, told Reuters Health this is often an overlooked problem.

                  "One barrier that is often overlooked or underestimated from the clinical side is the technical challenge of generating high-quality (next-generation sequencing) data," Dr. Chin said. "There is a sense that generating (the data) is easy, and it is the analysis that is hard. I would disagree, as I believe that the technology is still unstable, for lack of a better word, not yet turn-key, and no matter how good the analytics-interpretation become, if the data is poor quality, the result will be poor."

                  And mapping the genome is really only the first step. The next step is figuring out which mutations are relevant to the development of cancer and whether they can be targeted with a drug.

                  "I agree with the obvious barriers of interpretation. Not just analytically that we need improved algorithms, (but) more importantly, more knowledge and understanding of what each alteration means and how each event impact on clinical decision," Dr. Chin said.

                  The advances will require a "cultural change" in cancer research, Dr. Chin said, that makes "patient-oriented genomic research a standard, rather than a heroic effort by a researcher."
                  Gregory D. Pawelski

                    #10
                    First FDA Authorization for Next-Generation Sequencer

                    Francis S. Collins, M.D., Ph.D., and Margaret A. Hamburg, M.D.
                    N Engl J Med 2013;369:2369-2371. December 19, 2013. DOI: 10.1056/NEJMp1314561

                    This year marks 60 years since James Watson and Francis Crick described the structure of DNA and 10 years since the complete sequencing of the human genome. Fittingly, today the Food and Drug Administration (FDA) has granted marketing authorization for the first high-throughput (next-generation) genomic sequencer, Illumina's MiSeqDx, which will allow the development and use of innumerable new genome-based tests. When a global team of researchers sequenced that first human genome, it took more than a decade and cost hundreds of millions of dollars. Today, because of federal and private investment, sequencing technologies have advanced dramatically, and a human genome can be sequenced in about 24 hours for what is now less than $5,000. This is a rare example of technology development in which faster, cheaper, and better have coincided: as costs have plummeted and capacity has increased, the accuracy of sequencing has substantially improved. With the FDA's announcement, a platform that took nearly a decade to develop from an initial research project funded by the National Institutes of Health will be brought into use for clinical care. Clinicians can selectively look for an almost unlimited number of genetic changes that may be of medical significance. Access to these data opens the door for the transformation of research, clinical care, and patient engagement.

                    To see how this technology could be used, consider cancer. Comprehensive analysis of the genome sequence of individual cancers has helped uncover the specific mutations that contribute to the malignant phenotype, identify new targets for therapy, and increase the opportunities for choosing the optimal treatment for each patient. For instance, lung adenocarcinoma can now be divided into subtypes with unique genomic fingerprints associated with different outcomes and different responses to particular therapies. More broadly, recent work from the Cancer Genome Atlas demonstrates that the tissue of origin of a particular cancer may be much less relevant to prognosis and response to therapy than the array of causative mutations.1 As a result, patients diagnosed with a cancer for which there are few therapeutic options may increasingly benefit from drug therapies originally aimed at other cancers that share common driver mutations. The new technology allows us to go from our current approach of targeted searches for specific mutations in individual cancers to widespread use of approaches that survey the entire genome.

                    A major area of opportunity that has yet to be fully exploited is pharmacogenomics — the use of genomic information to identify the right drug at the right dose for each patient. More than 120 FDA-approved drugs have pharmacogenomics information in their labeling, providing important details about differences in response to the drug and, in some cases, recommending genetic testing before prescribing.2

                    But the full potential of pharmacogenomics is largely unrealized, because of the logistic challenges in obtaining suitable genomic information in a timely enough fashion to guide prescribing. Placing genomic information in the electronic medical record would facilitate this kind of personalized medicine. If the patient's entire genome were part of his or her medical record, then the complexities of acquiring a DNA sample, shipping it, and performing laboratory work would be replaced by a quick electronic query.

                    Although this scenario holds great promise, the utility of genomic information for drug prescribing must be documented with rigorous evidence. For example, three recently published clinical trials raise questions about the clinical utility of using pharmacogenetic information in the initial dosing of vitamin K antagonists.3

                    The FDA based its decision to grant marketing authorization for the Illumina instrument platform and reagents on their demonstrated accuracy across numerous genomic segments, spanning 19 human chromosomes. Precision and reproducibility across instruments, users, days, and reagent lots were also demonstrated.

                    The marketing authorization of a sequencing platform for clinical use will probably expand the incorporation of genetic information into health care. But even the most promising technologies cannot fully realize their potential if the relevant policy, legal, and regulatory issues are not adequately addressed. Already, key policy advances have helped smooth the way and address many of the public's concerns about the potential misuse of genetic information.4 For example, the Health Insurance Portability and Accountability Act of 1996 (HIPAA) and the Genetic Information Nondiscrimination Act (GINA) prohibit health insurers from considering genetic information as a preexisting condition, as material to underwriting, or as the basis for denying coverage. GINA also protects against use of genetic information by employers. These protections do not, however, extend to the disease manifestations of genetic risks. Although genomic information showing a predisposition to cancer would be protected under GINA, other clinical signs or symptoms indicative of cancer are not protected. Provisions of the Affordable Care Act set to go into effect in 2014 go a step further and will preclude consideration of all preexisting conditions, whether genomic or not, in establishing insurance premiums. Current federal laws, however, do not restrict the use of genomic information in life insurance, long-term care insurance, or disability insurance.

                    The legal landscape for the use of genomics in personalized medicine grew brighter in June of this year when the Supreme Court ruled (in Association for Molecular Pathology v. Myriad Genetics) that isolated naturally occurring DNA cannot be patented. This decision was a breakthrough for access to individual genetic tests but also, even more important, for the integration of genome sequencing into clinical care. Before the Myriad decision, there were substantial concerns that in order to offer whole genome sequencing, clinical laboratories would have to pay royalties to a long list of gene patent holders. The decision has opened the creative doors to an as yet unimaginable set of products that may benefit the public health.

                    The FDA has also been active in addressing other regulatory issues surrounding personalized medicine.5 Along with authorizing the Illumina technology for marketing, the FDA recognized the need for reference materials and methods that would permit performance assessment. As a result, the FDA collaborated with the National Institute for Standards and Technology (NIST) to develop reference materials consisting of whole human genome DNA, together with the best possible sequence interpretation of such genomes. The first human genome reference materials are expected to be available for public use in the next 12 months.

                    This marketing authorization of a non–disease-specific platform will allow any lab to test any sequence for any purpose. Thus, putting in place an appropriate risk-based regulatory framework is now critical to ensure the validation and quality of tests (called laboratory-developed tests, or LDTs) developed in-house by clinical laboratories.

                    The marketing authorization for the first next-generation genome sequencer represents a significant step forward in the ability to generate genomic information that will ultimately improve patient care. Yet it is only one step. There are many challenges ahead before personalized medicine can be considered truly embedded in health care. We need to continue to uncover variants within the genome that can be used to predict disease onset, affect progression, and modulate drug response. New genomic findings need to be validated before they can be integrated into medical decision making. Doctors and other health care professionals will need support in interpreting genomic data and their meaning for individual patients. Patients will want to be able to talk about their genetic information with their doctor. With the right information and support, patients will be able to participate alongside their doctors in making more informed decisions. Reimbursement issues need to be resolved to assure that patients have access to the best tests and that manufacturers have incentives to develop them.

                    The arrival of next-generation sequencing at this regulatory landmark is only the beginning. We need to work together to ensure that research progresses, that regulatory policies are developed, that patients' rights and needs are addressed, and that clinical use of genomic information is based on rigorous evidence.

                    References:

                    1. Kandoth C, McLellan MD, Vandin F, et al. Mutational landscape and significance across 12 major cancer types. Nature 2013;502:333-339

                    2. Table of pharmacogenomic biomarkers in drug labeling. Silver Spring, MD: Food and Drug Administration, 2013.

                    3. Furie B. Do pharmacogenetics have a role in the dosing of vitamin K antagonists? N Engl J Med 2013;369:2345-2346

                    4. Hudson KL. Genomics, health care, and society. N Engl J Med 2011;365:1033-1041

                    5. Paving the way for personalized medicine: FDA's role in a new era of medical product development. Silver Spring, MD: Food and Drug Administration, October 2013.

                    Gregory D. Pawelski

                      #11
                      Pharmacogenomics can be defined as the study of how a person’s genetic makeup determines response to a drug. Although any number of labs and techniques can detect mutant genes, this area of pharmacogenomics was ripe for proprietary tests, invented alongside the drug and owned by the drug developer and/or a partner in the diagnostics field.

                      This business opportunity evolved as more drugs were approved with companion diagnostics. Unfortunately, the introduction of these new drugs has not been accompanied by specific predictive tests allowing for a rational and economical use of the drugs.

                      Companion diagnostics and their companion therapies are what's being pushed as "personalized medicine" as they enable the identification of likely responders to therapies that work in patients with a specific molecular profile. However, companion diagnostics tend to only answer a targeted drug-specific question and may not address other important clinical decision needs.

                      These companion diagnostics are being used to predict responsiveness and determine candidacy for a particular therapy often included in drug labels as either required or recommended testing prior to therapy initiation. I certainly would not want to be "denied" treatment because of gene testing. Gene testing is not a clear predictor of a lack of benefit from particular targeted therapies.

                      Anyone familiar with cellular biology knows that having the genetic sequence of a known gene (genotype) does not equate to having the disease state (phenotype) represented by that gene. It requires specific cellular triggers and specialized cellular mechanisms to literally translate the code into the workhorses of the cellular world: proteins.
                      Gregory D. Pawelski

                        #12
                        Scientists challenge the genetic interpretation of biology

                        A proposal for reformulating the foundations of biology, based on the 2nd law of thermodynamics and which is in sharp contrast to the prevailing genetic view, is published in the Journal of the Royal Society Interface under the title "Genes without prominence: a reappraisal of the foundations of biology". The authors, Arto Annila, Professor of physics at Helsinki University and Keith Baverstock, Docent and former professor at the University of Eastern Finland, assert that the prominent emphasis currently given to the gene in biology is based on a flawed interpretation of experimental genetics and should be replaced by more fundamental considerations of how the cell utilises energy. There are far-reaching implications, both in research and for the current strategy in many countries to develop personalised medicine based on genome-wide sequencing.

                        Is it in your genes?

                        By "it" we mean intelligence, sexual orientation, increased risk of cancer, stroke or heart attack, criminal behaviour, political preference and religious beliefs, etcetera. Genes have been implicated in influencing, wholly or partly, all these aspects of our lives by researchers. Genes cannot cause any of these features, although geneticists have found associations between specific genes and all of these features, many of which are entirely spurious and a few are fortuitous.

                        How can we be so sure?

                        When a gene, a string of bases on the DNA molecule, is deployed, it is first transcribed and then translated into a peptide - a string of amino acids. To give rise to biological properties it needs to "fold" into a protein.

                        This process consumes energy and is therefore governed by the 2nd law, but also by the environment in which the folding takes place. These two factors mean that there is no causal relationship between the original gene coding sequence and the biological activity of the protein.
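
                        The sequence-level steps are mechanical enough to sketch in a few lines of Python; the point the authors press is that the final step, folding into an active protein, depends on energy flows and cellular context and cannot be read off the sequence. This is a toy illustration with an abbreviated codon table.

[CODE]
# Toy sketch of the steps described above: coding-strand DNA is
# transcribed to mRNA and translated to a peptide. Folding, the step
# that confers biological activity, is NOT derivable from the sequence.
CODON_TABLE = {
    "AUG": "Met", "UUU": "Phe", "GGC": "Gly", "UAA": "STOP",
    # ...the remaining 60 codons are omitted for brevity
}

def transcribe(coding_strand: str) -> str:
    """DNA coding strand -> mRNA (T becomes U)."""
    return coding_strand.replace("T", "U")

def translate(mrna: str) -> list:
    """mRNA -> peptide, reading codons until a stop codon."""
    peptide = []
    for i in range(0, len(mrna) - 2, 3):
        residue = CODON_TABLE.get(mrna[i:i + 3], "Xaa")  # Xaa = unknown
        if residue == "STOP":
            break
        peptide.append(residue)
    return peptide

print(translate(transcribe("ATGTTTGGCTAA")))  # ['Met', 'Phe', 'Gly']
[/CODE]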

                        Is there any empirical evidence to support this?

                        Yes. A Nordic study of twins conducted in 2000 showed there was no evidence that cancer was a "genetic" disease; that is, that genes play no role in the causation of cancer. A wider international study involving 50,000 identical twin pairs, published in 2012, showed that this conclusion applied to other common diseases as well. Since the sequencing of the human genome was completed in 2001, it has not proved possible to relate abnormal gene sequences to common diseases, giving rise to the problem of the "missing heritability".

                        What is the essence of the reformulation?

                        At the most fundamental level, organisms are energy-consuming systems, and the appropriate foundation in physics is that of complex dissipative systems. As energy flows in, out of, and within the complex molecular system called the cell, fundamental physical considerations, dictated by the 2nd law of thermodynamics, demand that these flows, called actions, are maximally efficient (follow the path of least resistance) in space and time. Energy exchanges can give rise to new emergent properties that modify the actions and give rise to more new emergent properties, and so on. The result is evolution from simpler to more complex and diverse organisms, in both form and function, without the need to invoke genes. This model is supported by earlier computer simulations to create a virtual ecosystem by Mauno Rönkkö of the University of Eastern Finland.

                        What implications does this have in practice?

                        There are many, but two are urgent.

                        1. to assume that genes are unavoidable influences on our health and behaviour will distract attention from the real causes of disease, many of which arise from our environment;

                        2. the current strategy towards basing healthcare on genome-wide sequencing, so called "personalised healthcare", will prove costly and ineffective.

                        What is personalised health care?

                        This is the idea that it will be possible to predict health outcomes at birth by determining the total DNA sequence (genome-wide sequence), and to take preventive measures. Most European countries have research programmes in this area, and in the UK a pilot study with 100,000 participants is underway.

                        Reference: University of Eastern Finland

                        Citation: "Scientists challenge the genetic interpretation of biology." Medical News Today. MediLexicon, Intl., 21 Feb. 2014.
                        Gregory D. Pawelski

                          #13
                          Is It Ethical to Deny Cancer Patients Functional Analyses?

                          Robert A. Nagourney, M.D.

                          The ethical standards that govern human experimentation have become an important topic of discussion. Clinical trials are conducted to resolve medical questions while protecting the rights and well-being of the participants. Human subject committees known as Institutional Review Boards (IRBs) confront not only questions of protocol design and patient protection but also the appropriateness of the questions to be answered. The Belmont Report (1979) defined three fundamental principles: i) respect for persons, ii) beneficence and iii) justice. These have been incorporated into regulatory guidelines codified in the Code of Federal Regulations, such as 45 CFR 46.111. One historical experience offers an interesting perspective upon contemporary oncologic practice.

                          With advances in cardiac surgery in the 1970s and 1980s, in both valvular and coronary artery bypass procedures, an alarming amount of post-operative bleeding was being observed. To address this complication, an enzyme inhibitor named Aprotinin was developed by Bayer pharmaceuticals. The drug works by preventing the body from breaking down blood clots (thrombolysis), which is critical for the prevention of postoperative bleeding. Concerns regarding its safety led to Aprotinin’s temporary withdrawal from the market, but those have been resolved and the drug is again available.

                          After Aprotinin’s introduction, clinical trials were conducted to test its efficacy. Initial results were highly favorable, as the drug consistently reduced post-op bleeding. By December 1991, 455 patients had been evaluated, providing strong statistical evidence that Aprotinin reduced bleeding by more than 70 percent. Despite this, trialists continued to accrue patients to Aprotinin versus “no treatment” studies. By December 1992, more than 2,000 patients had been accrued, and by October of 1994 the number had increased to more than 3,800. Yet the 75 percent risk reduction remained entirely unchanged. Thus, 3,400 patients, at untold cost and hardship, were subjected to the risk of bleeding to address a question that had long since been resolved.

                          In a 2005 analysis, Dean Fergusson et al. argued that it should have been evident to anyone who cared to review the literature that Aprotinin’s efficacy had been established. Further accrual to clinical trials beyond 1991 only exposed patients to an unwarranted risk of bleeding and had no possible chance of further establishing the clinical utility of the intervention. This stands as a striking lack of consideration for patient well-being. Fergusson’s review raises further questions about the ethics of conducting studies to prove already proven points. With this as a backdrop, it is instructive to examine functional profiling for the prediction of response to chemotherapy.

                          Beginning in 1997, a cumulative meta-analysis of 34 clinical trials (1,280 patients) correlating drug response with clinical outcome was reported. Drug-sensitive patients had a significantly higher objective response rate of 81 percent, versus 13 percent for those found drug resistant (P < 0.0000001).
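
                          A two-proportion comparison of that size is overwhelming by any test; here is a minimal Python sketch. The 1,280-patient total comes from the meta-analysis above, but the even split between sensitive and resistant groups is an assumption for illustration.

[CODE]
import math

# Reported response rates: 81% in assay-sensitive patients vs. 13% in
# assay-resistant patients. The even 640/640 split of the 1,280 pooled
# patients is an assumption; the post gives only the total.
n1 = n2 = 640
p1, p2 = 0.81, 0.13

# Two-proportion z-test with a pooled standard error.
p = (p1 * n1 + p2 * n2) / (n1 + n2)
se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se
print(f"z = {z:.1f}")  # ~24: far beyond any conventional threshold
[/CODE]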

                          This was met by the ASCO/Blue Cross-Blue Shield Technology Assessment published in the Journal of Clinical Oncology (Schrag D et al, J Clin Oncol, 2004), which called for further clinical trials. A subsequent meta-analysis correlated the outcomes of 1,929 patients with leukemia and lymphoma against laboratory results and again showed significantly superior outcomes for assay-directed therapy (P < 0.001) (Bosanquet AG, Proc Amer Soc Hematology, 2007).

                          In response, a second ASCO Guideline paper was published in 2011 (Burstein H et al, J Clin Oncol, 2011). Although the authors were forced to concede the importance of the field, they concluded that “participation in clinical trials evaluating these technologies remains a priority.”

                          Most recently, we conducted a cumulative meta-analysis of 2,581 treated patients which established that patients who receive laboratory “sensitive” drugs are 2.04-fold more likely to respond (p < 0.001) and 1.4-fold more likely to survive one year or more (p < 0.02) (Apfel C, Proc Am Soc Clin Oncol 2013).

                          Each successive meta-analysis has concluded, beyond a shadow of a doubt, that human tumor functional analyses (e.g. EVA-PCD) identify effective drugs and eliminate ineffective drugs better than any other tool at the disposal of cancer physicians today. Not unlike those investigators who continued to accrue patients to trials testing Aprotinin long after the results were in, oncologists today continue to clamor for trials to prove something which, to the dispassionate observer, is already patently obvious. If we now pose the question “Is it ethical to deny patients functional analyses to select chemotherapy?” the answer is a resounding No!
                          Gregory D. Pawelski
