As the documentation says: https://xgboost.readthedocs.io/en/latest/parameter.html#general-parameters
alpha [default=0, alias: reg_alpha]:
L1 regularization term on weights. Increasing this value will make model more conservative.
I'm wondering: can alpha be 100 or 1000? If so, how do I find the best alpha?
Alpha can take any value from 0 to infinity, so 100 or 1000 are perfectly valid. A common way of selecting the best value is to try a range of candidates and measure which one gives the best cross-validated performance. For example, see https://towardsdatascience.com/doing-xgboost-hyper-parameter-tuning-the-smart-way-part-1-of-2-f6d255a45dde
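Here is a minimal sketch of that idea using XGBoost's scikit-learn wrapper with GridSearchCV; the dataset, the `reg_alpha` grid, and the other hyperparameters are illustrative assumptions, not recommendations from the linked article:

    # Illustrative sketch: tune reg_alpha by cross-validated grid search.
    from sklearn.datasets import make_regression
    from sklearn.model_selection import GridSearchCV
    from xgboost import XGBRegressor

    # Synthetic data just for the example.
    X, y = make_regression(n_samples=500, n_features=20, random_state=42)

    # reg_alpha can be any value >= 0; try several orders of magnitude,
    # including large ones like 100 and 1000.
    param_grid = {"reg_alpha": [0, 0.01, 0.1, 1, 10, 100, 1000]}

    search = GridSearchCV(
        estimator=XGBRegressor(n_estimators=200, learning_rate=0.1),
        param_grid=param_grid,
        scoring="neg_root_mean_squared_error",
        cv=5,
    )
    search.fit(X, y)

    print("best reg_alpha:", search.best_params_["reg_alpha"])
    print("best CV RMSE:", -search.best_score_)

Whichever `reg_alpha` gives the best held-out score is the one to keep; in practice you would tune it together with the other regularization parameters (`reg_lambda`, `max_depth`, etc.) rather than in isolation.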