The idea of model fusions is pretty simple. You combine the predictions of several separate classifiers into a single prediction that is, in theory, better than the predictions of any of its individual constituents.
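
To make the idea concrete, here is a minimal sketch of prediction-level fusion. It is not BigML's Fusion feature; it simply averages the class probabilities of a few scikit-learn classifiers on a synthetic dataset (all names and data here are illustrative assumptions).

# A minimal sketch of prediction-level fusion, assuming scikit-learn
# and a synthetic dataset (hypothetical, for illustration only).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Train each constituent classifier separately.
models = [
    LogisticRegression(max_iter=1000),
    DecisionTreeClassifier(max_depth=5),
    RandomForestClassifier(n_estimators=50),
]
for model in models:
    model.fit(X_train, y_train)

# Fuse by averaging the predicted class probabilities,
# then pick the class with the highest averaged probability.
avg_proba = np.mean([m.predict_proba(X_test) for m in models], axis=0)
fused_prediction = np.argmax(avg_proba, axis=1)

for m in models:
    print(type(m).__name__, "accuracy:", m.score(X_test, y_test))
print("Fusion accuracy:", np.mean(fused_prediction == y_test))

Averaging probabilities is only one way to fuse models; voting on predicted classes or weighting each model by its validation performance are common variations on the same idea.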

As my colleague Teresa Álverez mentioned in a previous post, however, this doesn't usually lead to big gains in performance. We're typically talking about 5-10% improvements even in the best case. In many cases, OptiML will find something as good as or better than any combination you could try by hand.


Source of the article: DZONE (AI)
