
MSE or "R"MSE in nas/tabular benchmarks? #184

Description

@Deathn0t

Hello,

I noticed that in the original repo that generated the data, the mean_squared_err function is used (here).

But then, in HPOBench, valid_rmse_... is used (see here).

After trying to reproduce a constant mean predictor (from the original regression dataset), at least for the Parkinson's Telemonitoring benchmark, I get an estimated MSE of about 0.9, which is of the same order of magnitude as the performances reported in the benchmark. (I also did not find any sqrt call anywhere.)

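For reference, a minimal sketch of this kind of constant-mean baseline check (the `y_train` / `y_valid` arrays are hypothetical placeholders here, not the actual benchmark data loading):

```python
import numpy as np

# Placeholder targets; in the real check these would be the
# Parkinson's Telemonitoring regression targets.
rng = np.random.default_rng(0)
y_train = rng.standard_normal(1000)
y_valid = rng.standard_normal(200)

# Constant mean predictor: always predict the training-set mean.
y_pred = np.full_like(y_valid, y_train.mean())

mse = np.mean((y_valid - y_pred) ** 2)
rmse = np.sqrt(mse)

print(f"constant-mean baseline  MSE: {mse:.4f}  RMSE: {rmse:.4f}")
```

Comparing both values against the numbers reported in the benchmark is how I arrived at the ~0.9 MSE estimate above.
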
Do you have any guess as to whether it is MSE or RMSE?
