

2 weeks ago

Greetings to all the data scientists and Aster fans on the forum,

I am trying to convert a Healthcare Knowledge Model, derived from anonymized electronic patient records via noisy-or Bayes modelling, into a form that one of the Aster analytic functions could use. Since the original comes from text processing, my first thought was NaiveBayesTextPrediction... but the two turn out not to be so similar. The model is expressed as a graph with nodes, edges, and weights at https://github.com/clinicalml/HealthKnowledgeGraph.
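To experiment with the graph locally, a first step would be reading the disease-symptom edges into a per-disease structure. This is only a sketch: it assumes the edges have been exported to a CSV with `disease`, `symptom`, and `weight` columns (hypothetical names; the actual HealthKnowledgeGraph export files may use different columns and delimiters).

```python
import csv
from collections import defaultdict

def load_noisy_or_graph(path):
    """Read a disease->symptom edge list into {disease: [(symptom, weight), ...]}.

    ASSUMPTION: the file is a CSV with header columns disease, symptom, weight.
    The real HealthKnowledgeGraph files may be laid out differently.
    """
    graph = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            graph[row["disease"]].append((row["symptom"], float(row["weight"])))
    return graph
```

From there each class ("Diseases" below) maps to its weighted symptom list, which is the shape the conversion question is about.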

Compared to Aster Naive Bayes Text Prediction (NBTP), which is based on a Bernoulli model, this one is a little different: NBTP requires prior probabilities, and the weights are in a totally different range. (For those who have queried a model table from Naive Bayes Text Prediction: on top of the token probabilities there is an additional row that describes the prior probability of the class.) The noisy-or matrix I am struggling with looks like:

| Diseases | Symptoms |
| --- | --- |
| abscess | pain (0.318), fever (0.119), swelling (0.112), redness (0.094), chills (0.092) |

Where "Diseases" is the class and "Symptoms" is a token-probability array. From what I have read about noisy-or, the sum of the token probabilities should always equal 1. I am wondering if:

1) Such a model could be "tweaked" (maybe by converting the weights to "NBTP-like" probabilities?) into an NBTP model table?

2) There is another analytic function suitable for text processing that could make use of such a model (maybe a graph function...)?
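One way question 1 could be prototyped: renormalize each disease's edge weights so they sum to 1 (giving token probabilities in the range NBTP expects) and prepend a prior row per class. The priors are not in the graph itself, so the uniform prior below is only a placeholder assumption; in practice they would have to be estimated from the source corpus.

```python
# Special row name used by NBTP model tables for the class prior.
PRIOR_TOKEN = "ASTER_NAIVE_BAYES_PRIOR_PROB"

def to_nbtp_rows(graph, priors=None):
    """Turn {disease: [(symptom, weight), ...]} into (token, class, probability) rows.

    Weights are renormalized per disease so each class's token probabilities
    sum to 1. `priors` maps disease -> prior probability; if omitted, a uniform
    prior over diseases is used as a PLACEHOLDER (real priors would need to be
    estimated from data).
    """
    if priors is None:
        priors = {d: 1.0 / len(graph) for d in graph}
    rows = []
    for disease, edges in graph.items():
        total = sum(w for _, w in edges)
        rows.append((PRIOR_TOKEN, disease, priors[disease]))
        for symptom, w in edges:
            rows.append((symptom, disease, w / total))
    return rows
```

Whether NBTP would score sensibly with probabilities derived this way is exactly the open question; the renormalization changes the noisy-or semantics of the weights.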

The NBTP model table from Aster looks like this:

| Token | Class | Probability |
| --- | --- | --- |
| Obstruction | i00-i99 | 2.66595574513463E-4 |
| ASTER_NAIVE_BAYES_PRIOR_PROB | i00-i99 | 1.18632976359236E-4 |
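For comparing a converted model against a real one, the exported rows can be split into the prior rows and the token rows. This sketch assumes a pipe-delimited `token | class | probability` export matching the layout above; an actual NBTP model table dump may use other delimiters or extra columns.

```python
def split_model_rows(lines):
    """Split pipe-delimited 'token | class | probability' lines into
    (priors, tokens): priors maps class -> prior probability, tokens is a
    list of (token, class, probability) tuples.

    ASSUMPTION: the export layout matches the sample shown above.
    """
    priors, tokens = {}, []
    for line in lines:
        parts = [p.strip() for p in line.strip().strip("|").split("|")]
        if len(parts) != 3 or not parts[0]:
            continue  # skip blank or malformed lines
        token, cls, prob = parts[0], parts[1], float(parts[2])
        if token == "ASTER_NAIVE_BAYES_PRIOR_PROB":
            priors[cls] = prob
        else:
            tokens.append((token, cls, prob))
    return priors, tokens
```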

Thanks a lot!