Natural Language Processing

NLP 01 Bigram BeRP
Bigram model to predict the conditional probability of the next word, using data from The Berkeley Restaurant Project (BeRP).

NLP 01 Bigram mini corpus
Bigram model to predict the conditional probability of the next word on a mini corpus (French nurse).
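
Both bigram projects rest on the maximum-likelihood bigram estimate P(w_n | w_{n-1}) = C(w_{n-1} w_n) / C(w_{n-1}). A minimal Python sketch; the toy corpus below is illustrative, not the BeRP or mini-corpus data, and the repository code may differ:

```python
from collections import Counter

def bigram_probability(corpus_sentences, prev_word, word):
    """Maximum-likelihood estimate P(word | prev_word) = C(prev_word word) / C(prev_word)."""
    unigrams = Counter()
    bigrams = Counter()
    for sentence in corpus_sentences:
        tokens = ["<s>"] + sentence.lower().split() + ["</s>"]
        unigrams.update(tokens)
        bigrams.update(zip(tokens, tokens[1:]))
    if unigrams[prev_word] == 0:
        return 0.0
    return bigrams[(prev_word, word)] / unigrams[prev_word]

# Toy example (not the BeRP data):
corpus = ["i want chinese food", "i want to eat", "tell me about chinese food"]
print(bigram_probability(corpus, "want", "to"))       # C(want to) / C(want) = 1/2
print(bigram_probability(corpus, "chinese", "food"))  # 2/2 = 1.0
```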

NLP 01 Good-Turing Smoothing
Good-Turing smoothing for the catch of the day.
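
Good-Turing reestimates a count c as c* = (c + 1) * N_{c+1} / N_c and reserves N_1 / N of the probability mass for unseen events. A rough sketch using the standard catch-of-the-day fish counts, assumed here rather than taken from the project:

```python
from collections import Counter

def good_turing(counts):
    """Raw Good-Turing reestimation: c* = (c + 1) * N_{c+1} / N_c.
    In practice the N_c counts are smoothed for large c; this version is only for illustration."""
    total = sum(counts.values())
    freq_of_freq = Counter(counts.values())          # N_c: how many species were seen c times
    adjusted = {}
    for species, c in counts.items():
        if freq_of_freq.get(c + 1, 0) > 0:
            c_star = (c + 1) * freq_of_freq[c + 1] / freq_of_freq[c]
        else:
            c_star = c                               # fall back to the raw count when N_{c+1} = 0
        adjusted[species] = c_star / total           # discounted probability estimate c*/N
    p_unseen = freq_of_freq.get(1, 0) / total        # probability mass reserved for unseen species
    return adjusted, p_unseen

# Hypothetical catch: 10 carp, 3 perch, 2 whitefish, 1 trout, 1 salmon, 1 eel (18 fish)
catch = {"carp": 10, "perch": 3, "whitefish": 2, "trout": 1, "salmon": 1, "eel": 1}
probs, p_new_fish = good_turing(catch)
print(round(p_new_fish, 3))      # N_1 / N = 3/18 ~ 0.167
print(round(probs["trout"], 3))  # c*/N with c* = 2 * N_2 / N_1 = 2/3, so ~ 0.037
```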

NLP 01 Kneser-Ney Smoothing
Kneser-Ney smoothing for the Appeal by General de Gaulle.
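
Interpolated Kneser-Ney discounts each bigram count and interpolates with a continuation probability based on how many distinct contexts a word appears in. A small sketch with a made-up two-sentence corpus rather than the de Gaulle text:

```python
from collections import Counter

def kneser_ney_bigram(sentences, discount=0.75):
    """Interpolated Kneser-Ney for bigrams. Returns a function P(word | prev_word)."""
    unigram_counts = Counter()
    bigram_counts = Counter()
    for s in sentences:
        tokens = ["<s>"] + s.lower().split()
        unigram_counts.update(tokens)
        bigram_counts.update(zip(tokens, tokens[1:]))

    continuation = Counter(w for (_, w) in bigram_counts)     # distinct contexts each word follows
    followers = Counter(prev for (prev, _) in bigram_counts)  # distinct words following each context
    n_bigram_types = len(bigram_counts)

    def prob(prev_word, word):
        p_cont = continuation[word] / n_bigram_types
        c_prev = unigram_counts[prev_word]
        if c_prev == 0:
            return p_cont                                     # unseen context: back off fully
        discounted = max(bigram_counts[(prev_word, word)] - discount, 0) / c_prev
        lam = (discount / c_prev) * followers[prev_word]      # interpolation weight
        return discounted + lam * p_cont

    return prob

# Hypothetical mini corpus, not the de Gaulle text:
p = kneser_ney_bigram(["france is not alone", "she is not alone"])
print(p("is", "not"))   # 0.625 + 0.375 * (1/6) = 0.6875
```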

NLP 01 Minimum Edit distance
Minimum edit distance: from "manager" to "leader".
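
Minimum edit distance is computed by dynamic programming over a (len(source)+1) x (len(target)+1) table. A sketch using the textbook costs (insertion and deletion 1, substitution 2); the project may use different costs:

```python
def min_edit_distance(source, target, ins_cost=1, del_cost=1, sub_cost=2):
    """Dynamic-programming minimum edit distance (substitution cost 2, as in Jurafsky & Martin)."""
    n, m = len(source), len(target)
    d = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = d[i - 1][0] + del_cost
    for j in range(1, m + 1):
        d[0][j] = d[0][j - 1] + ins_cost
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            same = source[i - 1] == target[j - 1]
            d[i][j] = min(
                d[i - 1][j] + del_cost,                       # delete source[i-1]
                d[i][j - 1] + ins_cost,                       # insert target[j-1]
                d[i - 1][j - 1] + (0 if same else sub_cost),  # substitute (or match)
            )
    return d[n][m]

print(min_edit_distance("manager", "leader"))   # 7
```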

NLP 01 SMS Spam Classifier
Spam or not spam: classify SMS messages using Naive Bayes.
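
A compact way to reproduce the spam/ham setup is bag-of-words features plus multinomial Naive Bayes. The scikit-learn pipeline and the four-message training set below are assumptions for illustration only; the project uses a real SMS spam dataset and may implement Naive Bayes from scratch:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny hypothetical training set, not the project's data.
messages = [
    "WINNER!! Claim your free prize now",
    "Free entry in a weekly competition, text WIN to 80082",
    "Are we still meeting for lunch today?",
    "Can you send me the report tonight?",
]
labels = ["spam", "spam", "ham", "ham"]

model = make_pipeline(CountVectorizer(), MultinomialNB())   # bag of words + multinomial NB
model.fit(messages, labels)
print(model.predict(["Claim your free prize today"]))       # -> ['spam']
print(model.predict(["see you at lunch"]))                  # -> ['ham']
```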

NLP 01 Type Token Ratio
Lexical variety within a text: calculate the type-token ratio.
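
The type-token ratio is simply the number of distinct word types divided by the total number of tokens. A short sketch; the tokenisation rule is a simplifying assumption:

```python
import re

def type_token_ratio(text):
    """TTR = number of distinct word types / total number of word tokens."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return len(set(tokens)) / len(tokens) if tokens else 0.0

sample = "the cat sat on the mat and the dog sat by the door"
print(round(type_token_ratio(sample), 2))   # 9 types / 13 tokens ~ 0.69
```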

NLP 02 Naive Bayes
Text Classification with Naive Bayes and Binary NB.
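
Binary (Boolean) NB differs from standard multinomial NB only in clipping each word's count to at most 1 per document. One way to sketch this, assuming scikit-learn and a made-up sentiment corpus (the project may implement it differently):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

docs = ["great plot great acting", "boring plot and boring pacing",
        "great fun", "boring and predictable"]
labels = ["pos", "neg", "pos", "neg"]

# Binary NB: clip each word's count to 1 per document, both when estimating
# likelihoods (binary=True at fit time) and when scoring new documents.
binary_nb = make_pipeline(CountVectorizer(binary=True), MultinomialNB())
binary_nb.fit(docs, labels)

# Repeating a word no longer changes the score: both messages get identical probabilities.
print(binary_nb.predict_proba(["boring boring boring great"]))
print(binary_nb.predict_proba(["boring great"]))
```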

NLP 02 Naive Bayes negation
Text Classification with Naive Bayes and Binary NB: negation in samples.
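
A common way to feed negation into the classifier is to prefix every token between a negation word and the next punctuation mark with NOT_, so "didn't like" and "like" become different features. The heuristic below is one such sketch, not necessarily the exact rule used in the project:

```python
import re

NEGATIONS = {"not", "no", "never"}

def mark_negation(text):
    """Prefix every token after a negation word with NOT_ until the next punctuation mark."""
    out, negated = [], False
    for token in re.findall(r"[\w']+|[.,!?;]", text.lower()):
        if token in ".,!?;":
            negated = False
            out.append(token)
        elif negated:
            out.append("NOT_" + token)
        else:
            out.append(token)
            if token in NEGATIONS or token.endswith("n't"):
                negated = True
    return " ".join(out)

print(mark_negation("I didn't like this movie, but I like the soundtrack."))
# -> "i didn't NOT_like NOT_this NOT_movie , but i like the soundtrack ."
```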

NLP 02 Spelling Correction Non-Word
Actress or across: Laplace (add-one) smoothing for STM articles and speeches.

NLP 02 Spelling Correction Real Word
Laplace (add-one) smoothing for STM articles and X (Twitter).
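
Both spelling-correction projects rank candidate corrections with an add-one (Laplace) smoothed language model, in the spirit of the "actress or across" example. In the condensed sketch below, candidates within edit distance 1 are generated and scored with a Laplace-smoothed unigram model; the tiny corpus and the unigram simplification are assumptions, since the projects use STM articles, speeches and X posts and may score with bigrams:

```python
from collections import Counter
import re
import string

def train_lm(text):
    """Add-one (Laplace) smoothed unigram language model."""
    words = Counter(re.findall(r"[a-z]+", text.lower()))
    total, vocab = sum(words.values()), len(words)
    # The extra +1 in the denominator treats all unseen words as one pooled vocabulary entry.
    return lambda w: (words[w] + 1) / (total + vocab + 1)

def edits1(word):
    """All strings at edit distance 1 (deletes, transposes, replaces, inserts)."""
    letters = string.ascii_lowercase
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [a + b[1:] for a, b in splits if b]
    transposes = [a + b[1] + b[0] + b[2:] for a, b in splits if len(b) > 1]
    replaces = [a + c + b[1:] for a, b in splits if b for c in letters]
    inserts = [a + c + b for a, b in splits for c in letters]
    return set(deletes + transposes + replaces + inserts)

corpus = "the actress walked across the stage while the audience applauded the actress"
p = train_lm(corpus)

# Non-word case: "acress" is not in the vocabulary, so rank its edit-distance-1 candidates.
# For real-word correction, the observed word itself would also stay in the candidate set.
candidates = edits1("acress")
print(max(candidates, key=p))   # -> 'actress' (count 2) beats 'across' (count 1)
```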

Naming convention

Code: NLP for Natural Language Processing
01: difficulty of the concept
Description: high-level description

License

Distributed under the MIT License. See LICENSE for more information.