I've installed xgboost from a .whl file downloaded from http://www.lfd.uci.edu/~gohlke/pythonlibs.
When I try:
import xgboost
I get the following message:
d:\program files\python\lib\site-packages\sklearn\cross_validation.py:44: DeprecationWarning: This module was deprecated in version 0.18 in favor of the model_selection module into which all the refactored classes and functions are moved. Also note that the interface of the new CV iterators are different from that of this module. This module will be removed in 0.20. "This module will be removed in 0.20.", DeprecationWarning)
If I run import sklearn
before import xgboost
I get no message. I assume this warning won't affect the results, but how can I avoid it? I've also checked via pip that all packages are up to date.
First:
Yes, it does not affect the results at the moment.
Next, how to avoid it?
Well, there is no easy answer. Software evolves. Dependencies are inevitable. Resistance is futile. Proper package and configuration management policies are the only way to cope with it.
What works best for similar needs?
Isolate your experiments; VM-level isolation is enough to build on.
Keep using robust package management - Travis Oliphant's Anaconda is a good way to go.
Enforce configuration management to avoid bleeding once "new" packages get their place under the sun. Anaconda allows one to "freeze" a controlled environment, in which you define exactly which version / release number of each package is kept (and Anaconda keeps the cross-dependencies there well under control, so we can simply benefit from clear, well-identifiable environments in which the code used to run and keeps running).

Naive advice to always update to the "latest version" can spoil currently working toys and throw things into havoc. It is better to explicitly define / configure / identify / enforce fully controlled environments to work with and keep them for years ahead; a small illustrative version-check sketch follows below.

If one is under ISO/EN-9000+, NATO-STANAG AQAP-130+ et al., there are not many better ways to keep walking.
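As an illustration of such an explicitly defined environment (my addition, not part of the original answer), here is a minimal Python sketch that refuses to run if the installed releases drift away from the pinned ones. The version strings shown are purely illustrative assumptions, not recommendations.

# Illustrative sketch: fail fast if the runtime deviates from the frozen,
# validated configuration. The pinned version strings below are assumptions
# made for this example only.
import sklearn
import xgboost

PINNED_VERSIONS = {
    "scikit-learn": "0.18.1",   # hypothetical frozen release
    "xgboost":      "0.6",      # hypothetical frozen release
}

def check_pinned_versions():
    """Raise RuntimeError if an installed package differs from its pin."""
    found = {
        "scikit-learn": sklearn.__version__,
        "xgboost":      xgboost.__version__,
    }
    for name, expected in PINNED_VERSIONS.items():
        if found[name] != expected:
            raise RuntimeError("%s %s found, %s expected - environment has drifted"
                               % (name, found[name], expected))

check_pinned_versions()

The same pin-list naturally belongs in the Anaconda environment definition itself, so that the install source and the runtime check cannot drift apart.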