NEWS.md: 2 additions & 6 deletions

@@ -11,9 +11,9 @@
* The `liquidSVM` engine for `svm_rbf()` was deprecated due to that package's removal from CRAN. (#425)

- * New model specification `survival_reg()` for the new mode `"censored regression"` (#444). `surv_reg()` is now soft-deprecated (#448).
+ * The xgboost engine for boosted trees was translating `mtry` to xgboost's `colsample_bytree`. We now map `mtry` to `colsample_bynode` since that is more consistent with how random forest works. `colsample_bytree` can still be optimized by passing it in as an engine argument. `colsample_bynode` was added to xgboost after the `parsnip` package code was written. (#495)

- * New model specification `proportional_hazards()` for the `"censored regression"` mode (#451).
+ * For xgboost boosting, `mtry` and `colsample_bytree` can be passed as integer counts or proportions, while `subsample` and `validation` should be proportions. `xgb_train()` now has a new option `counts` for stating which scale `mtry` and `colsample_bytree` are on. (#461)
## Other Changes

@@ -23,12 +23,8 @@
* Re-organized model documentation for `update` methods (#479).

-
-
* `generics::required_pkgs()` was extended for `parsnip` objects.

-
-
# parsnip 0.1.5
* An RStudio add-in is available that writes multiple `parsnip` model specifications to the source window. It can be accessed via the IDE addin menus or by calling `parsnip_addin()`.
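The `mtry`-to-`colsample_bynode` change above can be exercised from the user-facing API. A minimal sketch in R (the parameter values here are illustrative, not from the source):

```r
library(parsnip)

# `mtry` now maps to xgboost's `colsample_bynode`; to also control
# `colsample_bytree`, pass it explicitly as an engine argument.
spec <-
  boost_tree(mtry = 3, trees = 500) %>%
  set_engine("xgboost", colsample_bytree = 0.8) %>%
  set_mode("regression")
```

Passing `colsample_bytree` through `set_engine()` keeps it tunable even though it is no longer the target of `mtry`.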
man/rmd/boost-tree.Rmd: 2 additions & 3 deletions

@@ -38,8 +38,7 @@ mod_param <-
update(sample_size = sample_prop(c(0.4, 0.9)))
```

- For this engine, tuning over `trees` is very efficient since the same model
- object can be used to make predictions over multiple values of `trees`.
+ For this engine, tuning over `trees` is very efficient since the same model object can be used to make predictions over multiple values of `trees`.
Note that `xgboost` models require that non-numeric predictors (e.g., factors) must be converted to dummy variables or some other numeric representation. By default, when using `fit()` with `xgboost`, a one-hot encoding is used to convert factor predictors to indicator variables.
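As a quick illustration of that note, a hedged sketch of fitting with a factor predictor (the data set and settings are illustrative, not from the source):

```r
library(parsnip)

spec <-
  boost_tree(trees = 20, mode = "regression") %>%
  set_engine("xgboost")

# `Species` is a factor; with the formula interface, fit() one-hot
# encodes it into indicator columns before the data reach xgboost.
xgb_fit <- fit(spec, Sepal.Length ~ ., data = iris)
```

No manual dummy-variable step is needed here because the default one-hot encoding handles the factor column.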