Using NLP to build a sarcasm classifier
Pick two or three news sources and select a few news titles from their feeds (about 5 is likely enough). For example, you could select CNN, Fox News, MSNBC, NPR, PBS, Al Jazeera, RT (Russia Today), Deutsche Welle, Facebook, BBC, France24, CCTV, NHK World, or another source you wish to analyze. Run your sarcasm model to predict whether the titles are interpreted as sarcastic or not. Analyze the results and comment on the different news sources you have selected.
The numbers shown are the model’s predicted probability that each headline is sarcastic.
Fox
Nevada sheriff tells library not to call 911 if they support BLM - 0.99998355
The Onion
Fisher-Price Announces Company Has Grown Out Of Making Stupid Toys For Babies - 0.99990344
The Best Cities To Live In For Fans Of Rock And Roll Museums And The Cleveland Browns - 0.0001301
NPR
TikTok CEO: Facebook Is Making ‘CopyCat’ Service ‘Disguised As Patriotism’ - 0.00624101
U.K. ‘Actively Avoided’ Investigating Russian Interference, Lawmakers Find - 0.9481692
I chose my news sources to have a mix of supposed bias. Fox, like CNN, is known to be a partisan news source. NPR is used for contrast, since it claims to be nonpartisan. I chose The Onion because all of its headlines are sarcastic or ironic, as it isn’t a true news platform. I chose the NPR articles because both contained quotations, to see if the model would flag quoted text as sarcastic.
I have no confidence in this sarcasm detector’s performance on real headlines. It only made two successful predictions out of five, and it doesn’t seem to treat quotation marks as sarcastic.
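For reference, scoring a new headline only requires tokenizing and padding it the same way the training data was prepared and then calling model.predict(). Below is a minimal sketch; the tokenizer, model, and max_length objects are assumed to be the ones defined in the sarcasm exercise notebook, and the padding settings may need to be adjusted to match whatever was used there.

```python
# Minimal sketch: score new headlines with the trained sarcasm model.
# `tokenizer`, `model`, and `max_length` are assumed to come from the
# sarcasm exercise notebook; adjust padding/truncating to match training.
from tensorflow.keras.preprocessing.sequence import pad_sequences

headlines = [
    "Nevada sheriff tells library not to call 911 if they support BLM",
    "Fisher-Price Announces Company Has Grown Out Of Making Stupid Toys For Babies",
    "The Best Cities To Live In For Fans Of Rock And Roll Museums And The Cleveland Browns",
    "TikTok CEO: Facebook Is Making 'CopyCat' Service 'Disguised As Patriotism'",
    "U.K. 'Actively Avoided' Investigating Russian Interference, Lawmakers Find",
]

# Convert the headlines into the same padded integer sequences used in training.
sequences = tokenizer.texts_to_sequences(headlines)
padded = pad_sequences(sequences, maxlen=max_length, padding="post", truncating="post")

# The model outputs one probability per headline (probability of being sarcastic).
for headline, prob in zip(headlines, model.predict(padded)):
    print(f"{prob[0]:.8f}  {headline}")
```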
Text generation with an RNN
Use the generate_text() command at the end of the exercise to produce synthetic output from your RNN model. Run it a second time and review the output. How has your RNN model been able to “learn” and “remember” the Shakespeare text in order to reproduce a similar output?
ROMEO: fetch my troth, and a true underately close the Duke of Jusining, And in my kindred be might each other’d shret.
NORTHUMBERLAND: Richard! where’s she such sorry; and of all The marriage of her for a mile-hoop, And so too rege! Have done, ‘twas in a hollobles.
First Murderer: Marry, sir, that he will sleep ng Kateful in: Yet not Henry’s foneard so, and I’ll hold in people; if my mistress canonect, Behelded with like like the pleasure of our good anoth, One of the people and the landuage kill’z In many o’ the breath of an; No more shall be assured in grievous children, Who, bid me very we mercy to another And: but their lain of mine. Thy father gried to London a Roman; make me treadou
Te take out of holy father But of the time to cote of trabiners.
GLOUCESTER: But come hither Warwick’s beastly?
PROSPERO: found?
First Gettinn combute: And will not fool, Norfolk, I pray thee, you may keeps forging.
POLIXENES: Good my lord.
LORD IO: Three heart is but the sleep-tiest, leg himself. G
Second run
Juliet: not, here to use war He have full of thine own report; and yields will forgot Full–have surp yourself. Aptle Citizen: Come on, my lord, to shame you.
POLIXENES: O, do not she comes. Now God’s walling distial hands: Thou hast with all the feerinant Anm,–but much desprizeness to you; and, for the battle’s now, Whe might have not, sleep not: therefore be action’d short.
KING RICHARD III: Thou hast neat–Was last; for all the thing pilme; Abreit of my name indeed. I shall go find thee, sir, we may in love Is not him what I will speak to crown’d his, This cruel whee little on your power and joar Than with the Tirtue’s enemy.
AEdile: Is strange is sin: Destroy’d yiel him twal, so much grief to the favour, as it Look Wherein to answer to uncient: let him have.
JOHN OF GAUNT: Have I not much tribune: For worbs both thy behalf Speak: away; for thy lips, Proud with ton of sullen kindless is to kill him,’ as he made, and in the bosom.
CLARENCE: Al, my lord: I be quite wonds; And with the l
The RNN seems to capture the general format well, starting each block with something resembling a character name. However, it has little sense of context: it mostly relies on the text immediately before to predict what comes next, rather than the meaning of the whole sentence. On occasion it spits out entire phrases it has memorized from the training text.
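For reference, generate_text() is essentially a sampling loop: feed in a seed string, sample the next character from the model’s output distribution, and feed that character back in while the RNN’s hidden state carries the earlier context forward. Below is a rough sketch, assuming a character-level model with a stateful GRU and char2idx / idx2char lookup tables; the names are placeholders and may not match the exercise notebook exactly.

```python
import tensorflow as tf

def generate_text(model, start_string, num_generate=500, temperature=1.0):
    """Rough sketch of an RNN sampling loop (names and signatures are assumptions)."""
    # Map the seed string to integer ids using the training-time lookup table.
    input_ids = tf.expand_dims([char2idx[c] for c in start_string], 0)
    generated = []

    model.reset_states()  # start from a fresh hidden state
    for _ in range(num_generate):
        predictions = model(input_ids)            # shape: (1, seq_len, vocab_size)
        predictions = predictions[:, -1, :] / temperature
        # Sample rather than take the argmax so the output doesn't fall into a loop.
        next_id = int(tf.random.categorical(predictions, num_samples=1)[0, 0])
        # Feed only the sampled character back in; the stateful GRU keeps the context.
        input_ids = tf.expand_dims([next_id], 0)
        generated.append(idx2char[next_id])

    return start_string + "".join(generated)
```

Lowering the temperature makes the samples stick closer to phrases memorized from the training text, while raising it makes the output more random.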
Stretch goal - replace the Shakespeare text with your own selected text and run the model again. Modify the source text as needed in order to generate_text() from the newly trained model.
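One way to do this swap is to point the data-loading cell at a different plain-text file and rebuild the character vocabulary from it; the rest of the pipeline (dataset windows, model, training loop) can then be re-run unchanged. A sketch, assuming the StringLookup-based preprocessing from the TensorFlow tutorial and a hypothetical bee_movie_script.txt file:

```python
import tensorflow as tf

# Load a custom corpus instead of the Shakespeare file (this path is hypothetical).
text = open("bee_movie_script.txt", "rb").read().decode(encoding="utf-8")
print(f"Length of text: {len(text)} characters")

# Rebuild the character vocabulary from the new text.
vocab = sorted(set(text))
print(f"{len(vocab)} unique characters")

# Lookup layers that map characters to ids and back; the downstream dataset
# and model construction can stay exactly as in the Shakespeare version.
ids_from_chars = tf.keras.layers.StringLookup(vocabulary=list(vocab), mask_token=None)
chars_from_ids = tf.keras.layers.StringLookup(
    vocabulary=ids_from_chars.get_vocabulary(), invert=True, mask_token=None)
```

With a script this small, training for fewer epochs (or adding more text) would likely reduce the overfitting described below.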
Here’s some generated text from the Bee Movie. It’s mostly just random lines from the movie with slight changes or gibberish. This model was definitely overfit: it had too little source text and too many epochs.
Accordingt to handy!
Wel.
Bear Week next week! They’re scary, hairy and here live.
Always leans forward, pointy shoulders, squinty eyes, very Jewish.
In tennis, you attack at the point of smellive.
Like what whalk it know.
Thet hat go do. Ley nextal wervies down.
Laht it! It’s your flowas have neg before bee bay?
Affirmall.
Well, I get a chill.
Well, I guess I’ll out lens you.
Sur Nack, here.
My sweater is Rappe andonian Band be
So, Mr. Sting, thank you for being help. Ded up the somes.
But on thought I can’t for you…
Oh, my.
Oome on! You got to think bee, Barry.
- Thinking bee.
- Thinking bee.
Thinking bee! Thinking bee! Thinking bee!
Handyould he talk corod f ro are smotion naweres
it if you gut it from!
The him out of my shere,
Hold it, som.
Bran my the fan you ever been sturn Gad id.
What dees got a little bit of magic.
That’s amazing. Why do we do that?
This is the coolest. What is it?
Neural machine translation with attention
Use the translate() command at the end of the exercise to translate three sentences from Spanish to English. How did your translations turn out?
The translations did pretty well. This first translation was correct.
Input: [screenshot of the first Spanish sentence and its predicted translation]
This one was close; it understood that it was a question about being at home. “You” should probably be “they” in this sentence.
Input: [screenshot of the second Spanish sentence and its predicted translation]
This last translation got the right types of words, but it comes out with the wrong idea. It should translate to “try to find out,” which can have a different meaning from “try to figure it out.”
Input: [screenshot of the third Spanish sentence and its predicted translation]
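For reference, the decoding behind translate() is roughly a greedy loop: encode the Spanish sentence once, then emit one English word per step, re-attending over the encoder outputs each time. The sketch below assumes the preprocess_sentence(), inp_lang / targ_lang tokenizers, encoder, decoder, and length/units constants built in the exercise; the exact call signatures may differ from your notebook.

```python
import tensorflow as tf

def translate(sentence):
    """Sketch of greedy decoding with attention (names and signatures are assumptions)."""
    sentence = preprocess_sentence(sentence)
    inputs = [inp_lang.word_index[w] for w in sentence.split(" ")]
    inputs = tf.keras.preprocessing.sequence.pad_sequences(
        [inputs], maxlen=max_length_inp, padding="post")
    inputs = tf.convert_to_tensor(inputs)

    # Encode the whole source sentence once.
    enc_out, enc_hidden = encoder(inputs, [tf.zeros((1, units))])
    dec_hidden = enc_hidden
    dec_input = tf.expand_dims([targ_lang.word_index["<start>"]], 0)

    result = []
    for _ in range(max_length_targ):
        # The decoder attends over enc_out at every step before predicting a word.
        predictions, dec_hidden, _attention = decoder(dec_input, dec_hidden, enc_out)
        predicted_id = int(tf.argmax(predictions[0]))
        word = targ_lang.index_word[predicted_id]
        if word == "<end>":
            break
        result.append(word)
        # Greedy decoding: feed the chosen word back in as the next decoder input.
        dec_input = tf.expand_dims([predicted_id], 0)

    return " ".join(result)
```

Greedy argmax decoding is the simplest option here; beam search usually gives better translations, but the exercise does not require it.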