Bezdek JC, Ehrlich R, Full W (1984) FCM: The fuzzy c-means clustering algorithm. Computers & Geosciences 10, 191–203.
Bishop C (2006) Pattern recognition and machine learning. Springer-Verlag.
Boyd S, Vandenberghe L (2004) Convex optimization. Cambridge University Press.
Breiman L, Friedman J, Stone CJ, Olshen RA (1984) Classification and regression trees. Chapman & Hall/CRC.
Campello RJGB, Moulavi D, Zimek A, Sander J (2015) Hierarchical density estimates for data clustering, visualization, and outlier detection. ACM Transactions on Knowledge Discovery from Data 10, 5:1–5:51.
Cena A, Gagolewski M (2020) Genie+OWA: Robustifying hierarchical clustering with OWA-based linkages. Information Sciences 520, 324–336.
Cortez P, Cerdeira A, Almeida F, Matos T, Reis J (2009) Modeling wine preferences by data mining from physicochemical properties. Decision Support Systems 47, 547–553.
Deisenroth MP, Faisal AA, Ong CS (2020) Mathematics for machine learning. Cambridge University Press.
Ester M, Kriegel H-P, Sander J, Xu X (1996) A density-based algorithm for discovering clusters in large spatial databases with noise. Proc. KDD’96, pp. 226–231.
Fletcher R (2008) Practical methods of optimization. Wiley.
Gagolewski M, Bartoszuk M, Cena A (2016) Genie: A new, fast, and outlier-resistant hierarchical clustering algorithm. Information Sciences 363, 8–23.
Goldberg DE (1989) Genetic algorithms in search, optimization and machine learning. Addison-Wesley.
Goodfellow I, Bengio Y, Courville A (2016) Deep learning. MIT Press.
Harper FM, Konstan JA (2015) The MovieLens datasets: History and context. ACM Transactions on Interactive Intelligent Systems 5, 19:1–19:19.
Hastie T, Tibshirani R, Friedman J (2017) The elements of statistical learning. Springer-Verlag.
Herlocker JL, Konstan JA, Terveen LG, Riedl JT (2004) Evaluating collaborative filtering recommender systems. ACM Transactions on Information Systems 22, 5–53.
Hubert L, Arabie P (1985) Comparing partitions. Journal of Classification 2, 193–218.
James G, Witten D, Hastie T, Tibshirani R (2017) An introduction to statistical learning with applications in R. Springer-Verlag.
Koren Y (2009) The BellKor solution to the Netflix grand prize.
Ling RF (1973) A probability theory of cluster analysis. Journal of the American Statistical Association 68, 159–164.
Lü L, Medo M, Yeung CH, Zhang Y-C, Zhang Z-K, Zhou T (2012) Recommender systems. Physics Reports 519, 1–49.
Müller AC, Nowozin S, Lampert CH (2012) Information theoretic clustering using minimum spanning trees. Proc. German conference on pattern recognition.
Ng AY, Jordan MI, Weiss Y (2001) On spectral clustering: Analysis and an algorithm. Proc. Advances in neural information processing systems 14 (NIPS’01).
Nocedal J, Wright SJ (2006) Numerical optimization. Springer.
Peng RD (2019) R programming for data science.
Piotte M, Chabbert M (2009) The Pragmatic Theory solution to the Netflix grand prize.
Quinlan JR (1986) Induction of decision trees. Machine Learning 1, 81–106.
Quinlan JR (1993) C4.5: Programs for machine learning. Morgan Kaufmann Publishers.
R Development Core Team (2021) R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria.
Rezaei M, Fränti P (2016) Set-matching measures for external cluster validity. IEEE Transactions on Knowledge and Data Engineering 28, 2173–2186.
Ricci F, Rokach L, Shapira B, Kantor P (eds) (2011) Recommender systems handbook. Springer.
Sarle WS et al. (eds) (2002) The FAQ.
Simon D (2013) Evolutionary optimization algorithms: Biologically-inspired and population-based approaches to computer intelligence. Wiley.
Therneau TM, Atkinson EJ (2019) An introduction to recursive partitioning using the RPART routines.
Töscher A, Jahrer M, Bell RM (2009) The BigChaos solution to the Netflix grand prize.
Venables WN, Smith DM, R Core Team (2021) An introduction to R.
Wickham H, Grolemund G (2017) R for data science. O’Reilly.
Zhang T, Ramakrishnan R, Livny M (1996) BIRCH: An efficient data clustering method for large databases. Proc. ACM SIGMOD international conference on management of data – SIGMOD’96, pp. 103–114.