Yes, and `tokenized` should be passed as an argument to the function.
When you are testing things out, it's best to remove the try/except so the real traceback surfaces. It then looks like this:
```python
import nltk
from nltk import pos_tag, PunktSentenceTokenizer
from nltk.corpus import state_union


def process_content(tokenized):
    for i in tokenized:
        words = nltk.word_tokenize(i)
        tagged = pos_tag(words)
        print(tagged)


if __name__ == '__main__':
    train_text = state_union.raw("2005-GWBush.txt")
    sample_text = state_union.raw("2006-GWBush.txt")
    custom_sent_tokenizer = PunktSentenceTokenizer(train_text)
    tokenized = custom_sent_tokenizer.tokenize(sample_text)
    process_content(tokenized)
```
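To see why removing the try/except matters while testing, here is a minimal sketch (no NLTK needed; `wordz` is a deliberate typo standing in for a real bug): a bare except swallows the actual error, while the unwrapped version lets the traceback point you straight at the mistake.

```python
def process_content_hidden(tokenized):
    # Bare except hides the real problem behind a generic message.
    try:
        for i in tokenized:
            words = i.split()                    # stand-in for nltk.word_tokenize
            tagged = [(w, "?") for w in wordz]   # bug: 'wordz' is undefined
            print(tagged)
    except Exception:
        print("something went wrong")            # traceback is lost here


def process_content_visible(tokenized):
    # Same bug, no try/except: the NameError surfaces with a full traceback.
    for i in tokenized:
        words = i.split()
        tagged = [(w, "?") for w in wordz]       # bug: 'wordz' is undefined
        print(tagged)


process_content_hidden(["a short sentence"])     # prints the vague message only
```

Once the code works, you can reintroduce error handling where it genuinely belongs, catching specific exception types rather than a bare `Exception`.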