Month: June 2016

Value Added Partners Invited – Cognitive Computing Platform Democratizing Medical Knowledge at the Point of Care.


Commoditization of Data Science and unleashing Democratized Medical Knowledge.

Ingine Inc's mission as a startup is to advance data science as applied to medical knowledge extraction from large data sets.


In particular, the following differentiators, driven by decades of research by Dr. Barry Robson, position Ingine Inc as a startup with a credible hope of advancing science in difficult-to-solve areas.

  1. Introducing the Hyperbolic Dirac Net (HDN), a machinery borrowed from quantum mechanics to advance data mining and deep learning beyond what Bayesian networks can deliver, against the backdrop of very large data sets riddled with uncertainty and high dimensionality. Most importantly, the HDN-based, non-hypothesis-driven approach allows us to create a learning-system workbench that is also amenable to research and discovery efforts based on deep learning techniques.
  2. Creating large-data-driven evidence-based medicine (EBM): scientifically curated medical knowledge that has gone through a process akin to systematic review.
  3. Integrating patient-centric studies with epidemiological studies into a comprehensive framework, advancing an integrated, large-data-driven bio-statistical approach that addresses both systemic and functional concerns. This means blending descriptive and inferential (HDN) statistical approaches.
  4. Introducing a comprehensive notational and symbolic programming framework: a unified mathematical framework delivering both probabilistic and deterministic methods of reasoning, from which varieties of cognitive experience can be created out of large, uncertainty-riddled data sets.
  5. Using all of the above to create a Point of Care platform experience that delivers EBM in the PICO format, the industry's gold standard.
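The h-complex machinery behind item 1 can be illustrated in miniature. Published accounts of the HDN pair each conditional probability with its inverse, P(A|B) alongside P(B|A), encoded as a split-complex ("hyperbolic") number a + hb where h² = +1. The sketch below is a toy under that reading; the class, function names and probabilities are all illustrative and are not Ingine's implementation.

```python
# Toy sketch of the h-complex ("hyperbolic imaginary") probability dual
# described in published accounts of the Hyperbolic Dirac Net (HDN).
# h is the hyperbolic unit with h*h = +1, so (a + h*b)(c + h*d)
# = (a*c + b*d) + h*(a*d + b*c). Illustrative only, not Ingine's code.

class HComplex:
    """Number a + h*b with h**2 = +1 (split-complex arithmetic)."""
    def __init__(self, a, b):
        self.a, self.b = a, b

    def __mul__(self, other):
        return HComplex(self.a * other.a + self.b * other.b,
                        self.a * other.b + self.b * other.a)

    def __repr__(self):
        return f"{self.a:+.4f} {self.b:+.4f}h"

def dual(p_fwd, p_bwd):
    """Encode the pair (P(A|B), P(B|A)) as mean + h * half-difference."""
    return HComplex((p_fwd + p_bwd) / 2, (p_fwd - p_bwd) / 2)

# Two chained relations, each carrying forward and backward conditionals.
edge1 = dual(0.8, 0.5)   # e.g. P(symptom|disease)=0.8, P(disease|symptom)=0.5
edge2 = dual(0.9, 0.6)   # hypothetical second link in the chain

print(edge1 * edge2)     # composed h-complex weight of the two-step path
```

Because split-complex multiplication is componentwise in the basis (a+b, a−b), the product simultaneously multiplies the forward conditionals and the backward conditionals, which is the bidirectional bookkeeping a one-way Bayesian chain lacks.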

While PICO is employed as the framework for creating an EBM-driven diagnosis process, combining the qualitative and quantitative methods that best achieve systematic review, the medical exam setting is used as the specification defining the template for enacting the EBM process. This rests on the premise that, to qualify as an expert system in the medical area, a system should be able to pass medical exams based on the knowledge it has acquired, knowledge scientifically curated through both automated machine learning and manual intervention.
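For readers unfamiliar with the acronym, PICO structures a clinical question as Population/Patient, Intervention, Comparison and Outcome. As a minimal illustration (the field names follow the standard PICO elements, but this is a generic sketch, not Ingine's schema), such a question can be modeled as a simple record:

```python
# Minimal illustration of a PICO-structured clinical question.
# Field names follow the standard PICO elements; this is a generic
# sketch, not the schema of any particular platform.
from dataclasses import dataclass

@dataclass
class PicoQuery:
    population: str    # P: who is the patient or population of interest?
    intervention: str  # I: what intervention or exposure is considered?
    comparison: str    # C: what alternative is it compared against?
    outcome: str       # O: what outcome is being measured?

    def as_question(self):
        return (f"In {self.population}, does {self.intervention} "
                f"compared with {self.comparison} affect {self.outcome}?")

q = PicoQuery("adults with type 2 diabetes", "metformin",
              "sulfonylureas", "HbA1c reduction at 12 months")
print(q.as_question())
```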

As part of the overall architecture, which employs design techniques such as non-predicated, non-hypothesis-driven and schema-less design, a semantic lake (a tag-driven knowledge repository) is created, from which the cognitive experience is delivered using inferential statistics. Furthermore, the capability can be delivered as a cloud computing platform where parallelization, in-memory processing, high-performance computing (HPC) and elastic scaling are addressed.


Precision Medicine: With the new program from the White House also comes redundant grant funding and waste – how does all this escape notice in high-science areas?


The recently announced Precision Medicine initiative is a grand mission to bring research institutions countrywide together to collaborate and holistically tackle one of civilization's most complex and pressing problems, cancer, employing genomics while engaging science in an integrative, interdisciplinary approach.

While the Precision Medicine mission is grand and certainly requires much attention and focus, and while many new tools are now available for medical research, such as complex algorithms from cognitive science (data mining, deep learning, etc.), big data processing and cloud computing, we also need efforts to arrest redundant spending and grants.

Speaking of precision medicine, such waste is quite an irony.

The White House Hosts a Precision Medicine Initiative Summit

A Grand Initiative, Yet Redundant Research Grants for the Same Methods

$1,399,997 – Study Description: We propose to develop Bayesian double-robust causal inference methods that are accurate, vigorous, and efficient for evaluating the clinical effectiveness of ATSs, utilizing electronic health records and registry studies, through working closely with our stakeholder advisory panel. The proposed “PCATS” R package will allow easy application of our methods without requiring R programming skills. We will assess clinical effectiveness of the expert-recommended ATSs for the pJIA patient population using a multicenter new-patient registry study design. The study outcomes are clinical responses and the health-related quality of life after a year of treatment.

$832,703 – The Bayesian statistical approach, in contrast, tries to use present as well as historical trial data in a combined framework and can provide better precision for CER. Bayesian methods are also flexible in capturing subjective prior opinion about multiple treatment options and tend to be robust. Despite these advantages, the Bayesian method for CER is underused and underdeveloped (see PCORI Methodology Report, p. 64, 2013). The primary reasons are a lack of understanding about its role, the lack of methodological development, and the unavailability of easy-to-use software to design and conduct such analyses.
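The idea this grant describes, folding historical trial data into the analysis of a current trial, can be illustrated with a conjugate beta-binomial model and a power prior that discounts the historical data. The numbers and function name below are invented for illustration and are not taken from the funded project.

```python
# Beta-binomial sketch of borrowing historical trial data as a prior.
# A historical trial (30 responders of 100) is downweighted by a power
# parameter in (0, 1) and layered on a flat Beta(1, 1) baseline prior.
# All numbers are invented; this illustrates the idea, not the grantee's method.

def posterior_mean(hist_success, hist_n, cur_success, cur_n, power=0.5):
    # Power prior: historical counts contribute with weight `power`.
    alpha = 1 + power * hist_success + cur_success
    beta = 1 + power * (hist_n - hist_success) + (cur_n - cur_success)
    return alpha / (alpha + beta)

# Current trial alone vs. current trial plus discounted historical data.
print(posterior_mean(30, 100, 12, 30, power=0.0))  # ignore history
print(posterior_mean(30, 100, 12, 30, power=0.5))  # borrow half-weight history
```

With borrowing turned on, the estimate is pulled toward the historical 30% response rate, which is the "better precision" trade-off the grant text alludes to.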

$839,943 – We propose to use a method of analysis called Bayes' method, in which data on the frequency of a disease in a population are combined with data taken from an individual patient (for example, the result of a diagnostic test) to calculate the chance that the patient has the disease given his or her test result. Clinicians currently use Bayes' method when screening patients for disease, but we believe the utility of this methodology extends far beyond its current use.
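The calculation this description refers to is Bayes' theorem: P(disease | positive) = P(positive | disease) · P(disease) / P(positive). A small worked sketch, with prevalence, sensitivity and specificity invented for illustration:

```python
# Bayes' theorem for diagnostic screening, as described above:
# P(disease | positive) = P(positive | disease) * P(disease) / P(positive).
# Prevalence, sensitivity and specificity are invented for illustration.

def post_prob_disease(prevalence, sensitivity, specificity):
    # Total probability of a positive test: true positives + false positives.
    p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
    return sensitivity * prevalence / p_pos

# Rare disease: even a good test yields a modest post-test probability (~0.16).
print(post_prob_disease(prevalence=0.01, sensitivity=0.95, specificity=0.95))
```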

$535,277 – Specific Aims:

  1. To encourage Bayesian analysis of HTE:
  • To develop recommendations on how to study HTE using Bayesian statistical models
  • To develop user-friendly, free, validated software for Bayesian methods for HTE analysis

  2. To develop recommendations about the choice of treatment effect scale for the assessment of HTE in PCOR.

The main products of this study will be:

  • recommendations or guidance on how to do Bayesian analysis of HTE in PCOR
  • software to do the Bayesian methods
  • recommendations or guidance on choosing appropriate treatment effect scale for HTE analysis in PCOR, and
  • demonstration of our products using data from large comparative effectiveness trials.
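The "treatment effect scale" question in Aim 2 can be made concrete: the same pair of event rates looks quite different as a risk difference, a risk ratio, or an odds ratio. The rates below are invented purely for illustration.

```python
# The same two event rates expressed on three common treatment effect scales.
# Rates are invented for illustration.

def effect_scales(p_treat, p_ctrl):
    rd = p_treat - p_ctrl                               # risk difference
    rr = p_treat / p_ctrl                               # risk ratio
    odds = lambda p: p / (1 - p)
    or_ = odds(p_treat) / odds(p_ctrl)                  # odds ratio
    return rd, rr, or_

rd, rr, or_ = effect_scales(0.10, 0.20)
print(f"risk difference {rd:+.2f}, risk ratio {rr:.2f}, odds ratio {or_:.2f}")
```

A reader shown only the risk ratio (0.50) and one shown only the risk difference (−0.10 absolute) may judge the same treatment very differently, which is why guidance on the choice of scale matters for HTE reporting.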