# Notes

Intelligent data analysis. Lecture notes and other literature, 2016

S. Haykin: Neural Networks, Prentice Hall, 1997.

C. M. Bishop: Pattern Recognition and Machine Learning, Springer, 2006.

T. Hastie, R. Tibshirani, J. Friedman: The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd edition, Springer, 2009.

G. James, D. Witten, T. Hastie, R. Tibshirani: An Introduction to Statistical Learning, with Applications in R, corrected 4th printing, Springer, 2014.

Useful additional material can be found at http://www.r-bloggers.com/in-depth-introduction-to-machine-learning-in-15-hours-of-expert-videos/

Related slides for the first part of the course can be found at http://education.ieee-cis.org/lectures/Conference-Tutorials/Neural-Networks-for-System-Modeling. The style is different, but the content is partly related.

Detailed topics of the lectures:

Part I (G. Horváth)

Lecture 1 (5 Sept.): Introduction. Course requirements, homework.

Lecture 2 (9 Sept.): A few data analysis examples; industrial examples. General questions of data analysis: collecting data, cleaning data, etc. The problems of too little data, of very large data sets, and of erroneous, distorted, or missing data. Dimensionality and the curse of dimensionality. Characterization of data: the data distribution (density function); data as random variables, their statistical properties and main statistical parameters (expected value, variance, covariance matrix, higher-order moments, etc.). The structure of the data set: input-output relations, clusters, etc. Data representations: how to represent data, and what an appropriate representation may be. Visualization. The role of machine learning in data analysis.
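The statistical parameters listed above (expected value, variance, covariance matrix) are usually estimated from samples. A minimal NumPy sketch, not part of the course materials; the distribution parameters below are illustrative:

```python
import numpy as np

# Toy data set: 200 samples of a 2-D random variable with known
# mean and covariance, so the sample estimates can be checked.
rng = np.random.default_rng(0)
X = rng.multivariate_normal(mean=[1.0, -2.0],
                            cov=[[2.0, 0.8], [0.8, 1.0]],
                            size=200)

mean = X.mean(axis=0)          # sample estimate of the expected value
cov = np.cov(X, rowvar=False)  # unbiased sample covariance matrix (divides by N-1)

print(mean)  # close to [1.0, -2.0]
print(cov)   # close to [[2.0, 0.8], [0.8, 1.0]]
```

With more samples the estimates converge to the true parameters; with few samples and many dimensions the covariance estimate degrades quickly, which is one face of the curse of dimensionality mentioned above.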

Lecture 3 (12 Sept.): Supervised learning for constructing an input-output relation. The simple linear case, linear regression: the basic principles of the LS, ML, and MAP (Bayesian) approaches. The LS solution.
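The closed-form LS solution discussed here, w = (XᵀX)⁻¹Xᵀy, can be sketched as follows (a minimal illustration with synthetic data, not course code; the true line y = 2x + 1 is an assumption of the example):

```python
import numpy as np

# Noisy observations of the line y = 2x + 1.
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, size=50)
y = 2.0 * x + 1.0 + 0.05 * rng.standard_normal(50)

# Design matrix with a bias column; solve the normal equations
# (X^T X) w = X^T y instead of inverting X^T X explicitly.
X = np.column_stack([np.ones_like(x), x])
w = np.linalg.solve(X.T @ X, X.T @ y)

print(w)  # approximately [1.0, 2.0]: intercept and slope
```

Under zero-mean Gaussian observation noise this LS solution coincides with the ML estimate, which is the connection the lecture draws between the two approaches.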

Lecture 4 (16 Sept.): Linear regression, a summary. Weighted LS estimation. The role of regularization; l2 and l1 regularization: ridge regression, LASSO. ML estimation under Gaussian observation noise. Gauss-Markov estimation. Reading: Bishop, Pattern Recognition and Machine Learning, Chapter 3 up to Section 3.3, plus the basic idea of Section 3.3. Bayesian linear regression. The slides of this lecture can be found here.
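Ridge regression adds an l2 penalty to the LS cost, which keeps the closed form: w = (XᵀX + λI)⁻¹Xᵀy. A minimal sketch, assuming synthetic data with an illustrative true weight vector (LASSO has no closed form and is omitted here):

```python
import numpy as np

def ridge(X, y, lam):
    """Ridge regression: solve (X^T X + lam*I) w = X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Synthetic data: 30 samples, 5 features, small Gaussian noise.
rng = np.random.default_rng(2)
X = rng.standard_normal((30, 5))
w_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])
y = X @ w_true + 0.1 * rng.standard_normal(30)

w_ls = ridge(X, y, 0.0)       # lam = 0 recovers plain LS
w_reg = ridge(X, y, 10.0)     # larger lam shrinks the weights toward zero

print(np.linalg.norm(w_reg) < np.linalg.norm(w_ls))  # shrinkage effect
```

Increasing λ trades a little bias for lower variance; the l1 penalty of the LASSO goes further and drives some weights exactly to zero, giving sparse solutions.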

Attached files:

- Statistical tests, a short summary
- Slides on linear regression, based on Bishop's book
- Linear regression
- Linear and nonlinear classification methods, neural networks
- Background material for AP's part (Bayesian networks, HMM, ...)
© 2010-2019 BME MIT