MLPClassifier

When you think about neural networks, you probably think of libraries like Keras or PyTorch. But it turns out that scikit-learn also supports them!

from sklearn.neural_network import MLPClassifier
from sklearn.neural_network import MLPRegressor

It turns out these Multi-layer Perceptron (MLP) models have a couple of interesting properties too.

  • The classifier supports sparse data as input. This can save a whole lot of memory when dealing with text data.
  • You can train the classifier via the familiar .fit()-call but it also supports .partial_fit()!
  • The classifier can be sped up via the early_stopping parameter. If set to True, it will automatically set aside 10% of the training data as a validation set and terminate training when the validation score stops improving. The sketch after this list demonstrates all three properties.
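
A minimal sketch of these three properties in action. The snippet below is not from the original post; the toy texts and labels are made up purely for illustration.

import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.neural_network import MLPClassifier

# Made-up toy data, purely for illustration.
texts = ["what a great movie", "what a terrible movie"] * 20
labels = np.array([1, 0] * 20)

# CountVectorizer returns a scipy sparse matrix; MLPClassifier accepts it as-is.
X = CountVectorizer().fit_transform(texts)

# early_stopping=True holds out 10% of the training data as a validation set
# and stops once the validation score no longer improves.
clf = MLPClassifier(hidden_layer_sizes=(32,), early_stopping=True, max_iter=500)
clf.fit(X, labels)

# Incremental training also works; the first partial_fit call must list all classes.
clf_stream = MLPClassifier(hidden_layer_sizes=(32,))
clf_stream.partial_fit(X, labels, classes=np.array([0, 1]))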

Benchmark

The classifier is not meant to be a state-of-the-art implementation of neural networks, but we were curious about its performance. So we ran a little text classification benchmark with the classifier. Here are some of the results.

setting                  train time (s)   predict time (s)   test accuracy   memory usage
sklearn                  110.5            0.0439             0.9144          138 MB
sklearn early stopping   25.5             0.0434             0.9174          147 MB
keras                    15.4             0.607              0.9253          2804 MB
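
The benchmark code itself isn't included here. As a rough sketch of how such a comparison could be set up in scikit-learn (the dataset, vectorizer, and layer size are our assumptions, not the original setup):

import time
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier

# Assumed setup: the post does not specify the dataset or pipeline.
train = fetch_20newsgroups(subset="train")
test = fetch_20newsgroups(subset="test")

vec = TfidfVectorizer()
X_train = vec.fit_transform(train.data)  # sparse features go straight into the MLP
X_test = vec.transform(test.data)

clf = MLPClassifier(hidden_layer_sizes=(100,), early_stopping=True)

tic = time.perf_counter()
clf.fit(X_train, train.target)
print(f"train time: {time.perf_counter() - tic:.1f}s")

tic = time.perf_counter()
accuracy = clf.score(X_test, test.target)  # timing covers both prediction and scoring
print(f"predict time: {time.perf_counter() - tic:.4f}s, test accuracy: {accuracy:.4f}")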

There are a few things to note:

  • The classification pipeline involves a step that turns text into numeric features. To keep the memory footprint low, you'd prefer to represent these text features in a sparse array, which scikit-learn supports. However, in the keras implementation we're forced to cast these to dense arrays, which has a big effect on memory usage (the sketch after this list illustrates the difference).
  • The keras implementation seems to converge a fair bit faster.
  • Making predictions on the test set is a fair bit faster with the scikit-learn implementation.
  • The early stopping setting seems advisable. At least in this benchmark, it speeds up training without degrading the accuracy on the test set.
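
To get a feel for the sparse-versus-dense difference, here's a small self-contained comparison (our sketch with a synthetic corpus; the exact numbers depend entirely on the data):

import random
from sklearn.feature_extraction.text import TfidfVectorizer

# Synthetic corpus: 2000 documents of 20 words drawn from a 5000-word vocabulary.
rng = random.Random(0)
vocab = [f"word{i}" for i in range(5000)]
docs = [" ".join(rng.choices(vocab, k=20)) for _ in range(2000)]

X = TfidfVectorizer().fit_transform(docs)

# The sparse CSR matrix only stores the non-zero entries; the dense cast stores everything.
sparse_mb = (X.data.nbytes + X.indices.nbytes + X.indptr.nbytes) / 1e6
dense_mb = X.toarray().nbytes / 1e6
print(f"sparse: {sparse_mb:.1f} MB, dense: {dense_mb:.1f} MB")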
