
This repository contains several sequence and sequence-to-sequence (seq2seq) models. One aim is to fit these models to biomedical OMICs data to predict nucleotides, amino acids, or tokens given the preceding sequence. Another aim is to infer gene regulatory networks and cell-cell communication by treating each single cell or gene as a token.


mehranpiran/NLP-models


Tweet classification: A fully connected neural network and a Naive Bayes classifier were trained on a collection of tweets to classify each one as positive or negative. The neural network achieved significantly better performance.
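A minimal sketch of the two approaches using scikit-learn; the toy tweets, labels, and model sizes below are illustrative placeholders, not the repository's actual data or architecture.

```python
# Sketch: Naive Bayes baseline vs. a small fully connected network for
# binary tweet sentiment. Toy data only; real training uses a tweet corpus.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.neural_network import MLPClassifier

tweets = ["I love this movie", "what a great day", "this is awful", "I hate waiting"]
labels = [1, 1, 0, 0]  # 1 = positive, 0 = negative

vec = CountVectorizer()          # bag-of-words features shared by both models
X = vec.fit_transform(tweets)

nb = MultinomialNB().fit(X, labels)                                  # Naive Bayes
nn = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500,
                   random_state=0).fit(X, labels)                    # fully connected net

test = vec.transform(["what a great movie"])
print("naive bayes:", nb.predict(test))
print("neural net: ", nn.predict(test))
```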

A few tags you might expect to see in a named entity recognition (NER) task are:

- geo: geographical entity
- org: organization
- per: person
- gpe: geopolitical entity
- tim: time indicator
- art: artifact
- eve: event
- nat: natural phenomenon
- O: filler word
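For illustration, here is a hypothetical token/tag alignment in this scheme; the sentence and its tags are made up to show the format, not taken from the repository's dataset.

```python
# Made-up example of how tokens pair with NER tags in this tag scheme.
tokens = ["Peter", "visited", "the", "United", "Nations", "in", "Geneva", "yesterday"]
tags   = ["per",   "O",       "O",   "org",    "org",     "O",  "geo",    "tim"]

for token, tag in zip(tokens, tags):
    print(f"{token:>10} -> {tag}")
```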

For the sequence generation task, a single-layer LSTM was trained on one of Shakespeare's narrative poems, "A Lover's Complaint." Given a segment of text from the poem, the model predicts the subsequent letters and words.
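A minimal character-level sketch of this idea in Keras, assuming a tiny corpus (the poem's opening line), a short context window, and illustrative layer sizes; the repository's actual hyperparameters and corpus handling may differ.

```python
# Sketch: single-layer LSTM that predicts the next character from a window
# of previous characters, then generates text greedily from a seed.
import numpy as np
import tensorflow as tf

corpus = "from off a hill whose concave womb reworded "  # placeholder snippet
chars = sorted(set(corpus))
char_to_idx = {c: i for i, c in enumerate(chars)}

window = 10  # context length; illustrative choice
X = np.array([[char_to_idx[c] for c in corpus[i:i + window]]
              for i in range(len(corpus) - window)])
y = np.array([char_to_idx[corpus[i + window]]
              for i in range(len(corpus) - window)])

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(len(chars), 16),
    tf.keras.layers.LSTM(64),                                # single LSTM layer
    tf.keras.layers.Dense(len(chars), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(X, y, epochs=5, verbose=0)  # toy training; real runs need far more data

# Greedy generation: feed a seed window, repeatedly append the argmax char.
seed = corpus[:window]
for _ in range(20):
    probs = model.predict(np.array([[char_to_idx[c] for c in seed[-window:]]]),
                          verbose=0)[0]
    seed += chars[int(np.argmax(probs))]
print(seed)
```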

An English-to-Portuguese neural machine translation model was built using Long Short-Term Memory (LSTM) networks with attention. You can replace the training corpus with parallel text for any desired language pair.
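The sketch below outlines one way to wire an LSTM encoder-decoder with dot-product attention in Keras; the vocabulary sizes, layer width, and the use of tf.keras.layers.Attention are assumptions for illustration, not the repository's exact architecture.

```python
# Sketch: LSTM encoder-decoder with dot-product attention for translation.
# Sizes are placeholders; a real model is trained on tokenized parallel text.
import tensorflow as tf

src_vocab, tgt_vocab, units = 1000, 1000, 128  # illustrative sizes

# Encoder: embed the source sentence and run it through an LSTM.
enc_in = tf.keras.Input(shape=(None,), name="source_tokens")
enc_emb = tf.keras.layers.Embedding(src_vocab, units)(enc_in)
enc_seq, enc_h, enc_c = tf.keras.layers.LSTM(
    units, return_sequences=True, return_state=True)(enc_emb)

# Decoder: embed the teacher-forced target prefix, start from the encoder's
# final state, and attend over all encoder time steps.
dec_in = tf.keras.Input(shape=(None,), name="target_tokens")
dec_emb = tf.keras.layers.Embedding(tgt_vocab, units)(dec_in)
dec_seq = tf.keras.layers.LSTM(units, return_sequences=True)(
    dec_emb, initial_state=[enc_h, enc_c])

# Dot-product attention: decoder states query the encoder states.
context = tf.keras.layers.Attention()([dec_seq, enc_seq])
merged = tf.keras.layers.Concatenate()([dec_seq, context])
logits = tf.keras.layers.Dense(tgt_vocab)(merged)

model = tf.keras.Model([enc_in, dec_in], logits)
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
model.summary()
```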
