42 results for “topic:kernel-pca”
Unsupervised and semi-supervised anomaly detection: Isolation Forest, Kernel PCA detection, ADOA, etc.
Front-end speech processing aims at extracting proper features from short-term segments of a speech utterance, known as frames. It is a prerequisite step toward any pattern recognition problem employing speech or audio (e.g., music). Here, we are interested in voice disorder classification, that is, in developing two-class classifiers that can discriminate between utterances of a subject suffering from, say, vocal fold paralysis and utterances of a healthy subject. The mathematical modeling of the speech production system in humans suggests that an all-pole system function is justified [1-3]. As a consequence, linear prediction coefficients (LPCs) constitute a first choice for modeling the magnitude of the short-term spectrum of speech. LPC-derived cepstral coefficients help to separate the system (e.g., vocal tract) contribution from that of the excitation. Taking into account the characteristics of the human ear, the mel-frequency cepstral coefficients (MFCCs) emerged as descriptive features of the speech spectral envelope. Similarly to MFCCs, the perceptual linear prediction coefficients (PLPs) can also be derived. These, so to speak, traditional features will be tested against agnostic features extracted by convolutional neural networks (CNNs) (e.g., auto-encoders) [4]. The pattern recognition step will be based on Gaussian mixture model (GMM) classifiers, K-nearest neighbor classifiers, Bayes classifiers, as well as deep neural networks. The Massachusetts Eye and Ear Infirmary Dataset (MEEI-Dataset) [5] will be exploited. At the application level, a library for feature extraction and classification in Python will be developed. Credible publicly available resources, such as Kaldi, will be used toward achieving our goal. Comparisons will be made against [6-8].
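The all-pole model mentioned above can be illustrated with a short sketch: a minimal NumPy implementation of LPC estimation via the autocorrelation method and the Levinson-Durbin recursion. The function name and signature are illustrative, not taken from the project's library.

```python
import numpy as np

def lpc(frame, order):
    """Estimate LPCs of one speech frame via the autocorrelation
    method and the Levinson-Durbin recursion (illustrative sketch)."""
    n = len(frame)
    # Autocorrelation lags r[0..order]
    r = np.correlate(frame, frame, mode="full")[n - 1:n + order]
    a = np.zeros(order + 1)
    a[0] = 1.0               # prediction polynomial A(z) = 1 + a1 z^-1 + ...
    err = r[0]               # prediction error energy
    for i in range(1, order + 1):
        # Reflection coefficient for model order i
        acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
        k = -acc / err
        a_prev = a.copy()
        for j in range(1, i):
            a[j] = a_prev[j] + k * a_prev[i - j]
        a[i] = k
        err *= 1.0 - k * k
    return a, err
```

On a frame drawn from an all-pole (autoregressive) process, the recovered coefficients approach the true AR parameters as the frame grows; in practice one would window each frame (e.g., Hamming) before computing the autocorrelation.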
In this repository I implement machine learning methods ranging from simple to complex, aiming at template-style code.
The code for Principal Component Analysis (PCA), dual PCA, Kernel PCA, Supervised PCA (SPCA), dual SPCA, and Kernel SPCA
Application of Deep Learning and Feature Extraction in Software Defect Prediction
Here I've demonstrated how and why to use PCA, Kernel PCA, LDA, and t-SNE for dimensionality reduction when working with higher-dimensional datasets.
My notes for Prof. Klaus Obermayer's "Machine Intelligence 2 - Unsupervised Learning" course at the TU Berlin
Re-Implementation of Gaussian Process Latent Variable Model algorithm & performance assessment against Kernel-PCA
Implementation of Bayesian PCA (Bishop, 1999) and Bayesian Kernel PCA
Source Code & Datasets for "Vertical Federated Principal Component Analysis and Its Kernel Extension on Feature-wise Distributed Data"
Application of principal component analysis capturing non-linearity in the data using the kernel approach
[NeurIPS 2024] Kernel PCA for Out-of-Distribution Detection
Repository for the code of the "Introduction to Machine Learning" (IML) lecture at the "Learning & Adaptive Systems Group" at ETH Zurich.
Performed different tasks such as data preprocessing, cleaning, classification, and feature extraction/reduction on the wine dataset.
Python package for plug and play dimensionality reduction techniques and data visualization in 2D or 3D.
Low-dimensional vector representations via kernel PCA with rational kernels
The code for Image Structural Component Analysis (ISCA) and Kernel ISCA
My Machine Learning course projects
Machine learning algorithms done from scratch in Python with Numpy/Scipy
5th-semester project concerning feature engineering and, in particular, nonlinear dimensionality reduction.
Unsupervised machine learning algorithms. Classical and kernel methods for non-linearly separable data.
Data Science Portfolio
Continuation of my machine learning work, organized by subject, starting with Evaluating Classification Models Performance.
Notes, homework and project for PSU's STAT 672 Winter 2020
📃 Exploration of Nonlinear Component Analysis as a Kernel Eigenvalue Problem
Unsupervised ML dimensionality reduction and clustering models for predicting whether a banknote is genuine, based on an OpenML dataset containing wavelet-analysis results for genuine and forged banknotes; practical exercise (Python 3).
Machine Learning assignments from coursework.
Implementation of supervised and unsupervised machine learning algorithms in Python from scratch!
Analyzing and overcoming the curse of dimensionality and exploring various gradient descent techniques with implementations in R
K-means, Spectral clustering, PCA, and Kernel PCA
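As a companion to the kernel PCA entries above, here is a minimal sketch of RBF kernel PCA in NumPy: the eigendecomposition of the double-centered Gram matrix, as in the kernel eigenvalue formulation. The function name and the `gamma` default are illustrative assumptions, not tied to any repository listed.

```python
import numpy as np

def rbf_kernel_pca(X, n_components=2, gamma=1.0):
    """Project X onto its leading kernel principal components
    under an RBF (Gaussian) kernel (illustrative sketch)."""
    # Pairwise squared Euclidean distances
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    K = np.exp(-gamma * d2)                     # Gram matrix
    # Double-center K (centering in feature space)
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    # Leading eigenpairs of the centered Gram matrix
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Training-point projections: sqrt(lambda_k) * v_k
    return vecs * np.sqrt(np.maximum(vals, 0.0))
```

Because the columns of the result are eigenvectors of the centered Gram matrix scaled by the square roots of their eigenvalues, each embedding dimension is zero-mean over the training set; out-of-sample projection would additionally require centering the test kernel against the training Gram matrix.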