Date posted: 06-Aug-2015
Category: Technology
Uploaded by: nihar-suryawanshi
Machine Learning And
Nihar N Suryawanshi, I.T. Grad at University of Pune
Sinhgad Academy Of Engineering, Pune | Department of Information Technology
Contents:
1. What is ML
2. Requirements
3. Components of ML
4. Supervised vs. Unsupervised
5. Classification vs. Regression
6. Naïve Bayes
7. SVM
8. Maximum Entropy
9. Lexicon and Classifier
10. Comparison
11. Conclusion
12. References
What is Machine Learning?
• Machine learning is a branch of artificial intelligence (AI) that gives computers the ability to learn without being explicitly programmed.
• Machines that teach themselves.
Things required for ML:
• Data
• Pattern
• Mathematical representation
Components Of ML
Types of learning
• Supervised Learning: the machine is given the essential information up front; both input and output (labeled) data sets are provided.
• Unsupervised Learning: little information is provided; the machine must discover structure in the data on its own.
Classification vs. Regression
• Classification groups the output into a class; the output takes one of a small set of discrete values. Ex: tumor -> yes or no.
• Regression predicts a continuous output value from the training data, giving a more fine-grained result. Ex: tumor -> a predicted size or risk score.
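The contrast above can be sketched in a few lines of Python on made-up 1-D data (all numbers here are illustrative, not from the slides): classification maps an input to a discrete label, while regression fits a function that outputs a continuous value.

```python
# Toy data: tumor size in cm -> discrete label / continuous risk score.
sizes  = [1.0, 2.0, 3.0, 4.0, 5.0]
labels = [0,   0,   1,   1,   1]    # classification target: discrete {0, 1}
risks  = [0.1, 0.2, 0.5, 0.7, 0.9]  # regression target: continuous in [0, 1]

def classify(size, threshold=2.5):
    """Classification: map the input to one of a small set of classes."""
    return 1 if size > threshold else 0

def fit_line(xs, ys):
    """Regression: fit y = a*x + b by ordinary least squares."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

a, b = fit_line(sizes, risks)
print(classify(3.5))          # discrete output: 1
print(round(a * 3.5 + b, 3))  # continuous output: a risk score (0.585)
```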
Techniques in ML
• Naïve Bayes
• Support Vector Machines
• Maximum Entropy
Naïve Bayes
• Based on Bayes' theorem:

P(c | d) = P(c) · P(d | c) / P(d)
c = event of raining, d = event of dark clouds
• We make the assumption that the events (features) are conditionally independent.
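A quick numeric illustration of the formula, using the rain/dark-clouds events from the slide. The probability values below are assumed for the example, not given in the original.

```python
# c = "raining", d = "dark clouds"; all probabilities are made up.
p_c = 0.3          # P(c): prior probability of rain
p_d_given_c = 0.9  # P(d|c): dark clouds, given that it rains
p_d = 0.4          # P(d): overall probability of dark clouds

# Bayes' theorem: P(c|d) = P(c) * P(d|c) / P(d)
p_c_given_d = p_c * p_d_given_c / p_d
print(p_c_given_d)  # 0.675: seeing dark clouds raises belief in rain
```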
Worked example (flu data set): P(Y) = 5/8 = 0.625, P(N) = 3/8 = 0.375
P(Chills = yes | Flu = yes) = 3/5 = 0.6
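The worked example can be completed as follows. The priors P(Y) = 5/8 and P(N) = 3/8 and the likelihood P(Chills=yes | Flu=yes) = 3/5 come from the slides; the likelihood for the Flu=no class is an assumed value added here for illustration.

```python
p_flu_yes = 5 / 8           # P(Y), from the slide
p_flu_no = 3 / 8            # P(N), from the slide
p_chills_given_yes = 3 / 5  # P(Chills=yes | Flu=yes), from the slide
p_chills_given_no = 1 / 3   # assumed likelihood for the Flu=no class

# Unnormalized posteriors: numerator of Bayes' theorem for each class.
score_yes = p_flu_yes * p_chills_given_yes
score_no = p_flu_no * p_chills_given_no

# Normalize so the two posteriors sum to 1.
total = score_yes + score_no
print(round(score_yes / total, 3))  # P(Flu=yes | Chills=yes) = 0.75
```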
Support Vector Machines
• The data is divided by a hyperplane, which forms the basis of classification.
• Designed by Vapnik.
• Performs linear classification.
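A minimal sketch of a linear classifier in the SVM spirit: the hyperplane w·x + b = 0 separates the two classes. This uses a simple hinge-loss subgradient descent on tiny made-up 2-D data, not a full SVM solver, so it illustrates the idea rather than a production implementation.

```python
# Toy linearly separable data: ((x1, x2), label in {-1, +1}).
data = [((1.0, 1.0), -1), ((1.5, 0.5), -1), ((3.0, 3.0), 1), ((3.5, 2.5), 1)]

w, b, lr, lam = [0.0, 0.0], 0.0, 0.1, 0.01
for _ in range(200):
    for (x1, x2), y in data:
        margin = y * (w[0] * x1 + w[1] * x2 + b)
        if margin < 1:  # inside the margin: push the hyperplane away
            w[0] += lr * (y * x1 - lam * w[0])
            w[1] += lr * (y * x2 - lam * w[1])
            b += lr * y
        else:           # safely classified: only apply regularization shrink
            w[0] -= lr * lam * w[0]
            w[1] -= lr * lam * w[1]

def predict(x1, x2):
    """Side of the hyperplane determines the predicted class."""
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else -1

print(predict(0.5, 1.0), predict(3.2, 3.0))  # expect -1 and 1
```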
Maximum Entropy
• Maximum Entropy is a probability-distribution estimation technique.
• The principle of maximum entropy: without external knowledge, one should prefer distributions that are uniform.
• Unlike Naïve Bayes, it does not assume the events (features) are independent.
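The "prefer uniform" principle can be verified directly: among distributions over the same outcomes, the uniform one has the highest entropy, so absent any constraints it is the least-committal choice.

```python
import math

def entropy(p):
    """Shannon entropy in bits: H(p) = -sum(p_i * log2(p_i))."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.7, 0.1, 0.1, 0.1]

print(entropy(uniform))                    # 2.0 bits, maximal for 4 outcomes
print(entropy(skewed) < entropy(uniform))  # True
```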
Combining Lexicon + Classifiers
• To increase efficiency, traditional lexicon-based systems can be combined with modern classifiers such as Naïve Bayes or SVM.
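One way such a combination can look, sketched under assumptions: a sentiment lexicon supplies word-level scores, and a classifier acts as a backoff when the lexicon is inconclusive. The lexicon entries and the trivial stand-in backoff below are illustrative only; a real system would use a trained Naïve Bayes or SVM model there.

```python
# Tiny assumed sentiment lexicon: word -> polarity score.
lexicon = {"good": 1, "great": 2, "bad": -1, "terrible": -2}

def lexicon_score(words):
    """Sum the polarity of every known word; unknown words score 0."""
    return sum(lexicon.get(w, 0) for w in words)

def classify(words):
    score = lexicon_score(words)
    if score != 0:  # the lexicon is decisive: use it directly
        return "pos" if score > 0 else "neg"
    # Fallback: a trivial stand-in for a trained classifier (e.g. NB/SVM).
    return "pos" if "recommend" in words else "neg"

print(classify("the movie was great".split()))    # "pos" via the lexicon
print(classify("i recommend this film".split()))  # "pos" via the fallback
```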
Comparison

| | Naïve Bayes | SVM | Maximum Entropy |
| Implementation | Easy to implement | Harder to implement | Harder to implement |
| Efficiency | Less efficient | Maximum efficiency; works well with large sets of words | Moderate efficiency |
| Usage | Limited use | Versatile: used in computer vision, text categorization, image processing | Hardly used |
Observations: results chart from Ref. [1] (figure not reproduced here).
Conclusion
• Machine learning can prove more efficient than traditional techniques for sentiment analysis (SA).
• Naïve Bayes can be useful in sentiment analysis and text categorization.
References
[1] Bo Pang, Lillian Lee, and Shivakumar Vaithyanathan. "Thumbs up? Sentiment Classification using Machine Learning Techniques." [Cornell University, IBM]
[2] Jayashri Khairnar and Mayura Kinikar. "Machine Learning Algorithms for Opinion Mining and Sentiment Classification." [IJSRP]
[3] Pierre Geurts. "An Introduction to Machine Learning." [Department of EE and CS & Bioinformatics, University of Liège]
[4] "A Tutorial on Naive Bayes Classification." [Carnegie Mellon University]
[5] "Using Maximum Entropy for Text Classification." [Carnegie Mellon University]
[6] Andrius Mudinas and Dell Zhang. "Combining Lexicon and Learning."
[7] Wikipedia and other Internet sources.
Thank You
Questions?
- Nihar Suryawanshi