Optuna vs Hyperopt: Which Hyperparameter Optimization Library Should You Choose?


To train a model on a set of parameters, you need to run something like this:
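The snippet itself didn't survive in this copy, so here is a minimal sketch of what such a train-and-evaluate step can look like; the LightGBM model, the synthetic dataset, and the parameter names are illustrative assumptions, not the original code:

```python
import lightgbm as lgb
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Placeholder data; the original study used its own dataset.
X, y = make_classification(n_samples=10000, random_state=42)
X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, test_size=0.2, random_state=42)

def train_evaluate(params):
    """Train on one parameter set and return the validation AUC."""
    model = lgb.LGBMClassifier(**params)
    model.fit(X_train, y_train)
    return roc_auc_score(y_valid, model.predict_proba(X_valid)[:, 1])

score = train_evaluate({'learning_rate': 0.05, 'num_leaves': 31})
print(f'validation AUC: {score:.4f}')
```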

For this study, I tried to find the best parameters within a budget of 100 runs.

I ran 6 experiments:

  • Random search (from Hyperopt) as a reference
  • Tree-structured Parzen Estimator (TPE) search for both Optuna and Hyperopt (a setup sketch for these strategies follows the list)
  • Adaptive TPE from Hyperopt
  • TPE from Optuna with a pruning callback, allowing more runs within the same time frame. It turns out that 400 runs with pruning take as much time as 100 runs without it.
  • Optuna with a Random Forest surrogate model from skopt (via the SkoptSampler integration)
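As a sketch of how those six strategies map onto library calls, reusing `train_evaluate` from above; the search space is illustrative, and `SkoptSampler` is an integration whose availability depends on your Optuna version:

```python
from hyperopt import atpe, fmin, hp, rand, tpe
import optuna

# Hyperopt search space (illustrative ranges).
space = {
    'learning_rate': hp.loguniform('learning_rate', -5, 0),
    'num_leaves': hp.quniform('num_leaves', 16, 256, 16),
}

def hyperopt_objective(params):
    # fmin minimizes, so return the negated AUC.
    return -train_evaluate({'learning_rate': params['learning_rate'],
                            'num_leaves': int(params['num_leaves'])})

# Random search, TPE, and Adaptive TPE differ only in the `algo` argument.
for algo in (rand.suggest, tpe.suggest, atpe.suggest):
    best = fmin(hyperopt_objective, space, algo=algo, max_evals=100)

# In Optuna, the sampler plays the same role.
def optuna_objective(trial):
    return train_evaluate({
        'learning_rate': trial.suggest_float('learning_rate', 1e-3, 1.0, log=True),
        'num_leaves': trial.suggest_int('num_leaves', 16, 256),
    })

for sampler in (optuna.samplers.TPESampler(),
                optuna.integration.SkoptSampler(
                    skopt_kwargs={'base_estimator': 'RF'})):
    study = optuna.create_study(direction='maximize', sampler=sampler)
    study.optimize(optuna_objective, n_trials=100)
```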

You may want to scroll down to the Example Script at the end.

If you want to explore all of those experiments in more detail, you can simply go to the experiment dashboard.


Both Optuna and Hyperopt improved over random search, which is good.

The TPE implementation from Optuna was slightly better than Hyperopt's Adaptive TPE, though not by much. That said, when running hyperparameter optimization, those small improvements are exactly what you are going for.

What is interesting is that the TPE implementations from Hyperopt and Optuna give vastly different results on this problem. Maybe the cutoff point γ between good and bad parameter configurations is chosen differently, or the sampling methods have defaults that work better for this particular problem.
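Both libraries expose this cutoff, so you can experiment with it yourself. A sketch, reusing the objective and space from above; the two libraries define gamma slightly differently, and the values here are illustrative:

```python
from functools import partial

from hyperopt import fmin, tpe
import optuna

# Hyperopt: gamma is the quantile of trials treated as "good" (default 0.25).
best = fmin(hyperopt_objective, space,
            algo=partial(tpe.suggest, gamma=0.25, n_startup_jobs=20),
            max_evals=100)

# Optuna: gamma is a callable mapping the number of finished trials
# to how many of them count as "good".
sampler = optuna.samplers.TPESampler(
    gamma=lambda n: min(int(0.25 * n), 25),
    n_startup_trials=20,
)
study = optuna.create_study(direction='maximize', sampler=sampler)
study.optimize(optuna_objective, n_trials=100)
```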

Moreover, using pruning decreased training time by 4x: I could run 400 searches in the time it took to run 100 without pruning. On the flip side, pruning produced a lower score. It may be different for your problem, but it is important to consider this trade-off when deciding whether to use pruning.
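If you want to try pruning yourself, here is a minimal sketch using Optuna's LightGBM integration, assuming the same data splits as above; the pruner choice and search space are illustrative:

```python
import lightgbm as lgb
import optuna
from optuna.integration import LightGBMPruningCallback
from sklearn.metrics import roc_auc_score

def objective(trial):
    params = {
        'objective': 'binary',
        'metric': 'auc',
        'learning_rate': trial.suggest_float('learning_rate', 1e-3, 1.0, log=True),
        'num_leaves': trial.suggest_int('num_leaves', 16, 256),
    }
    dtrain = lgb.Dataset(X_train, label=y_train)
    dvalid = lgb.Dataset(X_valid, label=y_valid)
    # The callback reports the validation AUC after every boosting round
    # and raises TrialPruned when the pruner gives up on the trial.
    booster = lgb.train(params, dtrain, valid_sets=[dvalid],
                        callbacks=[LightGBMPruningCallback(trial, 'auc')])
    return roc_auc_score(y_valid, booster.predict(X_valid))

study = optuna.create_study(direction='maximize',
                            pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=400)  # pruning frees budget for more runs
```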

For this section, I assigned points based on the improvement over the random search baseline (score 0.844), scaled by 1000:

  • Hyperopt got (0.850–0.844)*1000 = 6
  • Optuna got (0.854–0.844)*1000 = 10

Experimental results:

Optuna > Hyperopt
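The Example Script mentioned earlier is not included in this copy; the following is a minimal, self-contained reconstruction under the same assumptions as the sketches above (synthetic data, LightGBM, 100-run budget), including the point calculation from this section:

```python
import lightgbm as lgb
import optuna
from hyperopt import fmin, hp, rand
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=10000, random_state=42)
X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, test_size=0.2, random_state=42)

def train_evaluate(params):
    model = lgb.LGBMClassifier(**params)
    model.fit(X_train, y_train)
    return roc_auc_score(y_valid, model.predict_proba(X_valid)[:, 1])

# Random search baseline (from Hyperopt).
space = {'learning_rate': hp.loguniform('learning_rate', -5, 0),
         'num_leaves': hp.quniform('num_leaves', 16, 256, 16)}
random_scores = []

def hyperopt_objective(params):
    score = train_evaluate({'learning_rate': params['learning_rate'],
                            'num_leaves': int(params['num_leaves'])})
    random_scores.append(score)
    return -score  # fmin minimizes

fmin(hyperopt_objective, space, algo=rand.suggest, max_evals=100)
baseline = max(random_scores)

# Optuna TPE on the same budget.
def optuna_objective(trial):
    return train_evaluate({
        'learning_rate': trial.suggest_float('learning_rate', 1e-3, 1.0, log=True),
        'num_leaves': trial.suggest_int('num_leaves', 16, 256)})

study = optuna.create_study(direction='maximize')
study.optimize(optuna_objective, n_trials=100)

# Points = improvement over the random baseline, scaled by 1000.
print('Optuna points:', round((study.best_value - baseline) * 1000))
```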

