Commit

polish remarks
jieyima committed Mar 14, 2018
1 parent fe882c9 commit e2ff06b
Showing 1 changed file with 13 additions and 5 deletions.
18 changes: 13 additions & 5 deletions NLP_AWS_SouthPark.ipynb
@@ -12,7 +12,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
-"In this notebook, I used Amazon comprehend to implement sentiment analysis on one review of the episode of South Park from Dani Di Placido Forbes **#OnTV** section. "
+"In this notebook, I use Amazon Comprehend to run sentiment analysis on one review of a South Park episode from Dani Di Placido's Forbes **#OnTV** section (published Nov. 9, 2017). "
]
},
{
@@ -160,12 +160,13 @@
]
},
{
+"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
-"Seems that the review for Garrison Trump in this episode is mostly neative (a dominant proportion of 53%), and somewhat netural (41%). \n",
+"**It seems that the review of Garrison Trump in this episode is mostly negative (a dominant proportion of 53%) and somewhat neutral (41%).**\n",
"\n",
-"The result is not surprising, as in this episode President Garrison keeps on his campaign promise to \"fxxk them all to death\", and threaten those who against him. "
+"**The result is not surprising: in this episode President Garrison keeps his campaign promise to \"fxxk them all to death\" and threatens those who oppose him.** "
]
},
{
@@ -192,7 +193,7 @@
},
"outputs": [],
"source": [
-"path = \"/Users/Jessica/Desktop/BAX452/Natural_Language_Processing/SouthParkReview.txt\"\n",
+"path = \"~/Natural_Language_Processing/SouthParkReview.txt\"\n",
"doc1 = open(path, \"r\")\n",
"output = doc1.readlines()"
]
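A note on the context lines above: Python's built-in `open()` does not expand a leading `~`, so the committed `path = "~/Natural_Language_Processing/SouthParkReview.txt"` would fail as written. A minimal sketch of a portable fix, assuming the same file layout (the helper name `read_review_lines` is hypothetical, not from the notebook):

```python
import os

def read_review_lines(path):
    """Read a review file into a list of lines, expanding any leading '~'.

    open() does not expand '~' by itself, so do it explicitly; the
    'with' block also closes the file handle, which the original
    notebook cell never does.
    """
    with open(os.path.expanduser(path), "r") as doc:
        return doc.readlines()
```

Usage would mirror the notebook cell: `output = read_review_lines("~/Natural_Language_Processing/SouthParkReview.txt")`.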
@@ -201,7 +202,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
-" -- Note: there is a text size limit for Detectsemtiment analysis in AWS of 5000 bytes. So I have to manually cut down the text file and make the size fit. "
+" Author note: AWS Comprehend's DetectSentiment API limits input text to 5,000 bytes, so I had to manually cut the text file down to fit."
]
},
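The manual trimming described in the note above could also be done in code. A minimal sketch, assuming UTF-8 text (the function name `truncate_to_bytes` is hypothetical, not part of the notebook):

```python
def truncate_to_bytes(text, limit=5000, encoding="utf-8"):
    """Trim text so its encoded size fits Comprehend's byte limit.

    The limit applies to bytes, not characters, so slice the encoded
    form; decoding with errors="ignore" drops any multi-byte character
    that the byte cut would otherwise split in half.
    """
    return text.encode(encoding)[:limit].decode(encoding, errors="ignore")
```

A caveat of this sketch: it cuts mid-word rather than at a sentence boundary, which may slightly distort sentiment scores near the cut.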
{
@@ -261,6 +262,13 @@
"print(json.dumps(comprehend.detect_sentiment(Text=output[1], LanguageCode='en'), sort_keys=True, indent=4))"
]
},
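The `detect_sentiment` call above returns a dict whose `SentimentScore` field maps labels to probabilities; the percentages quoted in the notebook's conclusions come from that field. A minimal sketch of pulling out the dominant label, assuming the response shape documented for Comprehend (the helper name `summarize_sentiment` is hypothetical):

```python
def summarize_sentiment(response):
    """Condense a Comprehend DetectSentiment response into 'label (pct)'.

    `response` is the dict returned by comprehend.detect_sentiment();
    'SentimentScore' holds Positive/Negative/Neutral/Mixed probabilities.
    """
    scores = response["SentimentScore"]
    label, prob = max(scores.items(), key=lambda kv: kv[1])
    return "{} ({:.0%})".format(label.lower(), prob)
```

For the second review this would yield something like `"neutral (69%)"`, matching the markdown summary that follows.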
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"**The review for this episode is mostly neutral (a dominant proportion of 69%), with some positive (30%). The keyword 'favorite' contributes most to the positive sentiment.**"
+]
+},
{
"cell_type": "markdown",
"metadata": {},
