# YCML Machine Learning library on Github

Tags: Machine Learning

YCML is a new Machine Learning library available on Github as an Open Source (GPLv3) project. It can be used in iOS and OS X applications, and includes Machine Learning and optimization algorithms.

By Ioannis Chatzikonstantinou.


**YCML** is based on YCMatrix, a matrix library that makes use of the Accelerate Framework for improved performance. It is programmed in Objective-C, and can be used in iOS and OS X applications from both Objective-C and Swift. It is still in early development, but several algorithms are already implemented. The underlying calculations are optimised by calling BLAS and LAPACK implementations.

The following algorithms are currently available:

- Gradient Descent Backpropagation
- Resilient Backpropagation (RProp)
- Extreme Learning Machines (ELM)
- Forward Selection using Orthogonal Least Squares (for RBF Net)
- Forward Selection using Orthogonal Least Squares with the PRESS statistic

Where applicable, regularized versions of the algorithms have been implemented.
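To give a feel for one of the listed algorithms, the core of the Resilient Backpropagation (RProp) update rule can be sketched as follows. This is a plain-Python illustration of the textbook RProp- rule (with the usual default factors of 1.2 and 0.5), not YCML's Objective-C API; it adapts a per-weight step size from the sign of the gradient rather than its magnitude.

```python
# Illustrative RProp- update for a single parameter (not YCML's API).
# The step size grows while the gradient keeps its sign and shrinks
# when the sign flips, independently of the gradient's magnitude.

def rprop_step(weight, grad, prev_grad, step,
               eta_plus=1.2, eta_minus=0.5,
               step_max=50.0, step_min=1e-6):
    """Return the updated (weight, step, grad_to_remember) triple."""
    s = grad * prev_grad
    if s > 0:                       # same sign as last time: accelerate
        step = min(step * eta_plus, step_max)
        weight -= step if grad > 0 else -step
    elif s < 0:                     # sign flip: we overshot, slow down
        step = max(step * eta_minus, step_min)
        grad = 0.0                  # RProp-: skip the update this round
    else:                           # one of the gradients is zero
        if grad > 0:
            weight -= step
        elif grad < 0:
            weight += step
    return weight, step, grad
```

Running this repeatedly on a simple quadratic objective drives the weight toward the minimum while the step size adapts automatically.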

YCML also contains some optimization algorithms as support for deriving predictive models, although they can be used for any kind of problem:

- Gradient Descent (Single-Objective, Unconstrained)
- RProp Gradient Descent (Single-Objective, Unconstrained)
- NSGA-II (Multi-Objective, Constrained)
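The simplest of these, single-objective unconstrained gradient descent, can be sketched in a few lines. This is an illustrative plain-Python version, not YCML's implementation; the learning rate and iteration count are arbitrary example values.

```python
# Minimal single-objective, unconstrained gradient descent sketch
# (illustrative only; not YCML's Objective-C API).

def gradient_descent(grad, x0, lr=0.1, iters=200):
    """Repeatedly step against the gradient of the objective."""
    x = list(x0)
    for _ in range(iters):
        x = [xi - lr * gi for xi, gi in zip(x, grad(x))]
    return x

# Example: minimize f(x, y) = (x - 3)^2 + (y + 1)^2, minimum at (3, -1).
grad_f = lambda p: [2 * (p[0] - 3), 2 * (p[1] + 1)]
solution = gradient_descent(grad_f, [0.0, 0.0])  # approaches [3.0, -1.0]
```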

## Learning Features

- Embedded model input/output normalization facility.
- Generic Supervised Learning base class that can accommodate a variety of algorithms.
- Powerful and modular Backprop class that can be configured for stochastic gradient descent.
- Powerful Dataframe class with numerous editing functions that can be converted to/from a Matrix.
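The embedded normalization facility is the kind of per-feature min-max scaling sketched below. This is a generic illustration in plain Python; YCML's internal normalization may differ in its details.

```python
# Sketch of per-column min-max input/output scaling (illustrative;
# not YCML's actual normalization code).

def fit_minmax(rows):
    """Compute (min, max) for each column of a list of feature rows."""
    columns = list(zip(*rows))
    return [(min(c), max(c)) for c in columns]

def scale(row, stats):
    """Map each value into [0, 1]; constant columns map to 0."""
    return [(v - lo) / (hi - lo) if hi > lo else 0.0
            for v, (lo, hi) in zip(row, stats)]

def unscale(row, stats):
    """Invert scale(), recovering the original units."""
    return [v * (hi - lo) + lo for v, (lo, hi) in zip(row, stats)]
```

Scaling model outputs back with `unscale` is what lets predictions be reported in the original units of the training data.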

## Optimization

- Separate optimization routines for single- and multi-objective problems.
- Surrogate class that exposes a predictive model as an objective function, useful for optimization.
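The surrogate idea is simply to wrap a trained predictive model so an optimizer can treat it as an ordinary objective function. The sketch below uses hypothetical names in plain Python to show the pattern; it is not YCML's Surrogate class.

```python
# Sketch of a surrogate objective: an optimizer sees a plain f(x),
# while evaluations are delegated to a trained model. Names are
# hypothetical, not YCML's API.

class SurrogateObjective:
    def __init__(self, model):
        self.model = model        # any callable: inputs -> predicted value

    def __call__(self, x):
        return self.model(x)      # the optimizer just calls f(x)

# Usage: pretend a trained model approximates f(x) = (x - 2)^2 and
# minimize the surrogate with a crude grid search.
model = lambda x: (x - 2.0) ** 2
objective = SurrogateObjective(model)
best = min((i * 0.01 for i in range(-500, 500)), key=objective)
```

Because the surrogate is cheap to evaluate, an optimizer can probe it many more times than the real system the model was trained on.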

For more information and to download, visit https://github.com/yconst/YCML.