There are still some binary classification trainers that learn (generalized) linear models. As was done in #2506, they should all be typed according to the model types they learn, and they should not perform auto-calibration in API land.
We should also check whether other trainers can be made typed:
EnsembleTrainer
RegressionEnsembleTrainer
MulticlassDataPartitionEnsembleTrainer
MetaMulticlassTrainer
Ova related things
Stacking
TreeEnsembleFeaturizer
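To make concrete what "typed according to the types learned" and "no auto-calibration" mean, here is a minimal sketch using hypothetical plain-Java types (not ML.NET's actual API): the trainer's signature exposes the concrete model type, and calibration is a separate, caller-chosen wrapping step rather than something the trainer does implicitly.

```java
import java.util.function.DoubleUnaryOperator;

// Hypothetical model type (not ML.NET's API): a linear binary classifier.
class LinearBinaryModel {
    final double[] weights;
    final double bias;
    LinearBinaryModel(double[] weights, double bias) {
        this.weights = weights;
        this.bias = bias;
    }
    double score(double[] x) {
        double s = bias;
        for (int i = 0; i < x.length; i++) s += weights[i] * x[i];
        return s;
    }
}

// Calibration wraps a model and changes its type. If the trainer
// auto-calibrated, callers could never recover the raw model in a typed way.
class Calibrated<M> {
    final M model;
    final DoubleUnaryOperator calibrator;
    Calibrated(M model, DoubleUnaryOperator calibrator) {
        this.model = model;
        this.calibrator = calibrator;
    }
}

// A typed trainer: the learned model type appears in the return type,
// and no calibration happens here.
class TypedLinearTrainer {
    LinearBinaryModel fit(double[][] features, boolean[] labels) {
        // Placeholder "training": zero weights of the right dimensionality.
        return new LinearBinaryModel(new double[features[0].length], 0.0);
    }
}
```

Calibration then stays explicit: `new Calibrated<>(trainer.fit(X, y), s -> 1.0 / (1.0 + Math.exp(-s)))` yields a `Calibrated<LinearBinaryModel>`, so the caller still has typed access to the underlying weights.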
OK. The trouble with ensembling and meta-multiclass in particular is that the existing code has some provision (at least in its internals) for handling multiple types of trainers. This makes sense to me: is the trainer type, and its configuration, for every class in, say, OVA definitely going to be the same? It's not obvious to me that it must be. So things like that I might prefer to keep as-is until we figure out a proper design. (Just so long as we aren't painting ourselves into a corner, design-wise, by doing so.)
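The design tension can be sketched generically (hypothetical plain-Java types, not ML.NET's actual API): a strongly typed OVA commits to one trainer type for all classes, which preserves the per-class model type in the result, while allowing per-class trainers to differ collapses the result to the common base type.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;

// Hypothetical interface: a binary trainer produces a typed model for
// one positive class.
interface BinaryTrainer<M> {
    M train(int positiveClass);
}

// Strongly typed OVA: one trainer type (and configuration) for every
// class, so the concrete model type M survives in the result.
final class TypedOva<M> {
    private final Supplier<BinaryTrainer<M>> factory;
    TypedOva(Supplier<BinaryTrainer<M>> factory) { this.factory = factory; }
    List<M> train(int numClasses) {
        List<M> models = new ArrayList<>();
        for (int c = 0; c < numClasses; c++) {
            models.add(factory.get().train(c));
        }
        return models;
    }
}

// Heterogeneous OVA: per-class trainers may differ, but the result type
// collapses to Object, i.e. the typing benefit is lost.
final class HeterogeneousOva {
    List<Object> train(List<BinaryTrainer<?>> trainers) {
        List<Object> models = new ArrayList<>();
        for (int c = 0; c < trainers.size(); c++) {
            models.add(trainers.get(c).train(c));
        }
        return models;
    }
}
```

Typing OVA over a single trainer type forecloses the heterogeneous configuration the existing internals allow, which is the corner to avoid painting into.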