Tweet classification: A fully connected neural network and a Naive Bayes classifier were trained on a collection of tweets to label each tweet as positive or negative. The neural network achieved significantly better accuracy.
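As a minimal sketch of the Naive Bayes side of this comparison (the toy tweets, tokenization, and smoothing scheme below are illustrative assumptions, not the project's actual pipeline), a classifier can score a tweet by summing per-word log-odds:

```python
import math
from collections import Counter

def train_naive_bayes(tweets, labels):
    """Fit word log-odds and a class log-prior from labeled, tokenized tweets.

    `tweets` is a list of token lists; `labels` holds 1 (positive) / 0 (negative).
    """
    pos_counts, neg_counts = Counter(), Counter()
    for tokens, label in zip(tweets, labels):
        (pos_counts if label == 1 else neg_counts).update(tokens)
    vocab = set(pos_counts) | set(neg_counts)
    n_pos, n_neg = sum(pos_counts.values()), sum(neg_counts.values())
    # Laplace (add-one) smoothing so unseen words never zero out a class.
    loglik = {
        w: math.log((pos_counts[w] + 1) / (n_pos + len(vocab)))
         - math.log((neg_counts[w] + 1) / (n_neg + len(vocab)))
        for w in vocab
    }
    logprior = math.log(labels.count(1) / labels.count(0))
    return logprior, loglik

def predict(tokens, logprior, loglik):
    """Return 1 (positive) if the summed log-odds are positive, else 0."""
    score = logprior + sum(loglik.get(w, 0.0) for w in tokens)
    return 1 if score > 0 else 0

# Tiny illustrative training set (hypothetical data).
tweets = [["great", "day"], ["love", "this"], ["awful", "day"], ["hate", "this"]]
labels = [1, 1, 0, 0]
logprior, loglik = train_naive_bayes(tweets, labels)
print(predict(["love", "day"], logprior, loglik))    # 1 (positive)
print(predict(["awful", "hate"], logprior, loglik))  # 0 (negative)
```

The fully connected network replaces these hand-computed log-odds with learned weights over the same bag-of-words features, which is where its accuracy advantage comes from.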
A few tags you might expect to see in a named entity recognition task are:
- geo: geographical entity
- org: organization
- per: person
- gpe: geopolitical entity
- tim: time indicator
- art: artifact
- eve: event
- nat: natural phenomenon
- O: filler word
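To illustrate how this tag set is applied to a sentence (this toy lexicon lookup is only a sketch of the labeling scheme, not the trained sequence model), each token gets one tag, with `O` as the default for filler words:

```python
# Toy lexicon mapping a few known entity words to the tag set above
# (hypothetical entries for illustration only).
LEXICON = {
    "london": "geo",
    "google": "org",
    "shakespeare": "per",
    "france": "gpe",
    "monday": "tim",
}

def tag_tokens(tokens):
    """Assign each token a tag from the lexicon, defaulting to 'O'."""
    return [(tok, LEXICON.get(tok.lower(), "O")) for tok in tokens]

print(tag_tokens(["Shakespeare", "visited", "London", "on", "Monday"]))
# [('Shakespeare', 'per'), ('visited', 'O'), ('London', 'geo'), ('on', 'O'), ('Monday', 'tim')]
```

A real NER model predicts these tags from context rather than a fixed lexicon, which is what lets it handle unseen names.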
Sequence generation: A single-layer LSTM model was trained on one of Shakespeare's narrative poems, "A Lover's Complaint." Given a seed segment of text from the poem, the model predicts the subsequent letters and words.
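The generation loop can be sketched as follows. This is a forward-pass sketch only: the weights below are random placeholders rather than the trained parameters, and the tiny character vocabulary, hidden size, and greedy decoding are all illustrative assumptions.

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One forward step of a single LSTM cell.

    x: one-hot input (V,); h_prev, c_prev: previous hidden/cell states (H,).
    W (4H, V), U (4H, H), b (4H,) hold the stacked gate parameters.
    """
    z = W @ x + U @ h_prev + b   # stacked pre-activations for all four gates
    H = h_prev.shape[0]
    i = sigmoid(z[0:H])          # input gate
    f = sigmoid(z[H:2*H])        # forget gate
    g = np.tanh(z[2*H:3*H])      # candidate cell state
    o = sigmoid(z[3*H:4*H])      # output gate
    c = f * c_prev + i * g
    h = o * np.tanh(c)
    return h, c

rng = np.random.default_rng(0)
vocab = sorted(set("a lover's complaint"))   # toy character vocabulary
V, H = len(vocab), 16
W = rng.normal(0, 0.1, (4*H, V))             # random stand-in weights,
U = rng.normal(0, 0.1, (4*H, H))             # NOT trained parameters
b = np.zeros(4*H)
Wout = rng.normal(0, 0.1, (V, H))            # hidden state -> character scores

def generate(seed, n):
    """Feed the seed through the cell, then greedily emit n more characters."""
    idx = {ch: i for i, ch in enumerate(vocab)}
    h, c = np.zeros(H), np.zeros(H)
    out = seed
    for ch in seed:
        h, c = lstm_step(np.eye(V)[idx[ch]], h, c, W, U, b)
    for _ in range(n):
        nxt = vocab[int(np.argmax(Wout @ h))]
        out += nxt
        h, c = lstm_step(np.eye(V)[idx[nxt]], h, c, W, U, b)
    return out

print(generate("love", 5))  # seed plus 5 predicted characters
```

With trained weights, the same loop produces text in the style of the poem; sampling from a softmax over `Wout @ h` instead of taking the argmax gives more varied output.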
An English-to-Portuguese neural machine translation model was built using Long Short-Term Memory (LSTM) networks with attention. You can replace the corpus with parallel text for any desired language pair.
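The attention component of such a model can be sketched in isolation. This is a generic scaled dot-product attention over encoder states, shown with random stand-in vectors; the actual model's scoring function and dimensions may differ.

```python
import numpy as np

def dot_product_attention(query, keys, values):
    """Return a context vector: the softmax-weighted sum of encoder values.

    query: decoder hidden state (H,); keys, values: encoder states (T, H).
    """
    scores = keys @ query / np.sqrt(query.shape[0])  # scaled similarities, (T,)
    weights = np.exp(scores - scores.max())          # numerically stable softmax
    weights /= weights.sum()                         # one weight per source token
    context = weights @ values                       # (H,)
    return context, weights

rng = np.random.default_rng(1)
T, Hd = 5, 8                      # 5 source tokens, hidden size 8 (toy values)
enc = rng.normal(size=(T, Hd))    # stand-in encoder outputs
dec = rng.normal(size=Hd)         # stand-in decoder state
ctx, w = dot_product_attention(dec, enc, enc)
print(round(float(w.sum()), 6))   # attention weights sum to 1
```

At each decoding step the context vector is combined with the decoder state to predict the next target-language token, letting the model focus on different source words as it translates.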