Commit 815e056

dusenberrymw authored and jkbradley committed
[SPARK-7985] [ML] [MLlib] [Docs] Remove "fittingParamMap" references. Updating ML Doc "Estimator, Transformer, and Param" examples.

Updating the ML Doc's *"Estimator, Transformer, and Param"* example to use `model.extractParamMap` instead of `model.fittingParamMap`, which no longer exists. mengxr, I believe this addresses (part of) the *update documentation* TODO list item from [PR 5820](#5820).

Author: Mike Dusenberry <[email protected]>

Closes #6514 from dusenberrymw/Fix_ML_Doc_Estimator_Transformer_Param_Example and squashes the following commits:

- 6366e1f [Mike Dusenberry] Updating instances of model.extractParamMap to model.parent.extractParamMap, since the Params of the parent Estimator could possibly differ from those of the Model.
- d850e0e [Mike Dusenberry] Removing all references to "fittingParamMap" throughout Spark, since it has been removed.
- 0480304 [Mike Dusenberry] Updating the ML Doc "Estimator, Transformer, and Param" Java example to use model.extractParamMap() instead of model.fittingParamMap(), which no longer exists.
- 7d34939 [Mike Dusenberry] Updating the ML Doc "Estimator, Transformer, and Param" example to use model.extractParamMap instead of model.fittingParamMap, which no longer exists.

(cherry picked from commit ad06727)
Signed-off-by: Joseph K. Bradley <[email protected]>
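The rationale in commit 6366e1f (a fitted Model's parameters live on the parent Estimator, so the doc examples read them via `model.parent.extractParamMap`) can be illustrated with a small self-contained sketch. This is a hypothetical miniature of the pattern, not Spark's actual classes; the names `Estimator`, `Model`, `set`, `fit`, and `extractParamMap` here are stand-ins for illustration only:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical miniature of the Estimator/Model pattern: a Model keeps a
// reference to the Estimator ("parent") that produced it, and the parameters
// used during fit() are read from that parent, not from the Model itself.
public class ParamMapSketch {
    static class Estimator {
        final Map<String, Object> params = new HashMap<>();

        Estimator set(String name, Object value) {
            params.put(name, value);
            return this;
        }

        // fit() produces a Model that remembers its parent Estimator.
        Model fit() {
            return new Model(this);
        }

        // Returns a copy of the current parameter (name -> value) pairs.
        Map<String, Object> extractParamMap() {
            return new HashMap<>(params);
        }
    }

    static class Model {
        final Estimator parent;

        Model(Estimator parent) {
            this.parent = parent;
        }
    }

    public static void main(String[] args) {
        Estimator lr = new Estimator().set("maxIter", 10).set("regParam", 0.01);
        Model model = lr.fit();
        // The fitting parameters are reached through the parent Estimator.
        System.out.println("Model was fit using parameters: "
            + model.parent.extractParamMap());
    }
}
```

The sketch also shows why the commit prefers `model.parent.extractParamMap` over an `extractParamMap` on the Model: the parent Estimator's Params could differ from the Model's own, and the parent is the object that actually held the settings at fit time.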
1 parent 139c824 commit 815e056

File tree: 11 files changed (+14, −14 lines)

docs/ml-guide.md (4 additions, 4 deletions)

@@ -207,7 +207,7 @@ val model1 = lr.fit(training.toDF)
 // we can view the parameters it used during fit().
 // This prints the parameter (name: value) pairs, where names are unique IDs for this
 // LogisticRegression instance.
-println("Model 1 was fit using parameters: " + model1.fittingParamMap)
+println("Model 1 was fit using parameters: " + model1.parent.extractParamMap)
 
 // We may alternatively specify parameters using a ParamMap,
 // which supports several methods for specifying parameters.
@@ -222,7 +222,7 @@ val paramMapCombined = paramMap ++ paramMap2
 // Now learn a new model using the paramMapCombined parameters.
 // paramMapCombined overrides all parameters set earlier via lr.set* methods.
 val model2 = lr.fit(training.toDF, paramMapCombined)
-println("Model 2 was fit using parameters: " + model2.fittingParamMap)
+println("Model 2 was fit using parameters: " + model2.parent.extractParamMap)
 
 // Prepare test data.
 val test = sc.parallelize(Seq(
@@ -289,7 +289,7 @@ LogisticRegressionModel model1 = lr.fit(training);
 // we can view the parameters it used during fit().
 // This prints the parameter (name: value) pairs, where names are unique IDs for this
 // LogisticRegression instance.
-System.out.println("Model 1 was fit using parameters: " + model1.fittingParamMap());
+System.out.println("Model 1 was fit using parameters: " + model1.parent().extractParamMap());
 
 // We may alternatively specify parameters using a ParamMap.
 ParamMap paramMap = new ParamMap();
@@ -305,7 +305,7 @@ ParamMap paramMapCombined = paramMap.$plus$plus(paramMap2);
 // Now learn a new model using the paramMapCombined parameters.
 // paramMapCombined overrides all parameters set earlier via lr.set* methods.
 LogisticRegressionModel model2 = lr.fit(training, paramMapCombined);
-System.out.println("Model 2 was fit using parameters: " + model2.fittingParamMap());
+System.out.println("Model 2 was fit using parameters: " + model2.parent().extractParamMap());
 
 // Prepare test documents.
 List<LabeledPoint> localTest = Lists.newArrayList(

mllib/src/main/scala/org/apache/spark/ml/classification/GBTClassifier.scala (1 addition, 1 deletion)

@@ -208,7 +208,7 @@ private[ml] object GBTClassificationModel {
 require(oldModel.algo == OldAlgo.Classification, "Cannot convert GradientBoostedTreesModel" +
 s" with algo=${oldModel.algo} (old API) to GBTClassificationModel (new API).")
 val newTrees = oldModel.trees.map { tree =>
-// parent, fittingParamMap for each tree is null since there are no good ways to set these.
+// parent for each tree is null since there is no good way to set this.
 DecisionTreeRegressionModel.fromOld(tree, null, categoricalFeatures)
 }
 val uid = if (parent != null) parent.uid else Identifiable.randomUID("gbtc")

mllib/src/main/scala/org/apache/spark/ml/classification/RandomForestClassifier.scala (1 addition, 1 deletion)

@@ -170,7 +170,7 @@ private[ml] object RandomForestClassificationModel {
 require(oldModel.algo == OldAlgo.Classification, "Cannot convert RandomForestModel" +
 s" with algo=${oldModel.algo} (old API) to RandomForestClassificationModel (new API).")
 val newTrees = oldModel.trees.map { tree =>
-// parent, fittingParamMap for each tree is null since there are no good ways to set these.
+// parent for each tree is null since there is no good way to set this.
 DecisionTreeClassificationModel.fromOld(tree, null, categoricalFeatures)
 }
 val uid = if (parent != null) parent.uid else Identifiable.randomUID("rfc")

mllib/src/main/scala/org/apache/spark/ml/regression/GBTRegressor.scala (1 addition, 1 deletion)

@@ -198,7 +198,7 @@ private[ml] object GBTRegressionModel {
 require(oldModel.algo == OldAlgo.Regression, "Cannot convert GradientBoostedTreesModel" +
 s" with algo=${oldModel.algo} (old API) to GBTRegressionModel (new API).")
 val newTrees = oldModel.trees.map { tree =>
-// parent, fittingParamMap for each tree is null since there are no good ways to set these.
+// parent for each tree is null since there is no good way to set this.
 DecisionTreeRegressionModel.fromOld(tree, null, categoricalFeatures)
 }
 val uid = if (parent != null) parent.uid else Identifiable.randomUID("gbtr")

mllib/src/main/scala/org/apache/spark/ml/regression/RandomForestRegressor.scala (1 addition, 1 deletion)

@@ -152,7 +152,7 @@ private[ml] object RandomForestRegressionModel {
 require(oldModel.algo == OldAlgo.Regression, "Cannot convert RandomForestModel" +
 s" with algo=${oldModel.algo} (old API) to RandomForestRegressionModel (new API).")
 val newTrees = oldModel.trees.map { tree =>
-// parent, fittingParamMap for each tree is null since there are no good ways to set these.
+// parent for each tree is null since there is no good way to set this.
 DecisionTreeRegressionModel.fromOld(tree, null, categoricalFeatures)
 }
 new RandomForestRegressionModel(parent.uid, newTrees)

mllib/src/test/scala/org/apache/spark/ml/classification/DecisionTreeClassifierSuite.scala (1 addition, 1 deletion)

@@ -266,7 +266,7 @@ private[ml] object DecisionTreeClassifierSuite extends FunSuite {
 val oldTree = OldDecisionTree.train(data, oldStrategy)
 val newData: DataFrame = TreeTests.setMetadata(data, categoricalFeatures, numClasses)
 val newTree = dt.fit(newData)
-// Use parent, fittingParamMap from newTree since these are not checked anyways.
+// Use parent from newTree since this is not checked anyways.
 val oldTreeAsNew = DecisionTreeClassificationModel.fromOld(
 oldTree, newTree.parent.asInstanceOf[DecisionTreeClassifier], categoricalFeatures)
 TreeTests.checkEqual(oldTreeAsNew, newTree)

mllib/src/test/scala/org/apache/spark/ml/classification/GBTClassifierSuite.scala (1 addition, 1 deletion)

@@ -128,7 +128,7 @@ private object GBTClassifierSuite {
 val oldModel = oldGBT.run(data)
 val newData: DataFrame = TreeTests.setMetadata(data, categoricalFeatures, numClasses = 2)
 val newModel = gbt.fit(newData)
-// Use parent, fittingParamMap from newTree since these are not checked anyways.
+// Use parent from newTree since this is not checked anyways.
 val oldModelAsNew = GBTClassificationModel.fromOld(
 oldModel, newModel.parent.asInstanceOf[GBTClassifier], categoricalFeatures)
 TreeTests.checkEqual(oldModelAsNew, newModel)

mllib/src/test/scala/org/apache/spark/ml/classification/RandomForestClassifierSuite.scala (1 addition, 1 deletion)

@@ -158,7 +158,7 @@ private object RandomForestClassifierSuite {
 data, oldStrategy, rf.getNumTrees, rf.getFeatureSubsetStrategy, rf.getSeed.toInt)
 val newData: DataFrame = TreeTests.setMetadata(data, categoricalFeatures, numClasses)
 val newModel = rf.fit(newData)
-// Use parent, fittingParamMap from newTree since these are not checked anyways.
+// Use parent from newTree since this is not checked anyways.
 val oldModelAsNew = RandomForestClassificationModel.fromOld(
 oldModel, newModel.parent.asInstanceOf[RandomForestClassifier], categoricalFeatures)
 TreeTests.checkEqual(oldModelAsNew, newModel)

mllib/src/test/scala/org/apache/spark/ml/regression/DecisionTreeRegressorSuite.scala (1 addition, 1 deletion)

@@ -83,7 +83,7 @@ private[ml] object DecisionTreeRegressorSuite extends FunSuite {
 val oldTree = OldDecisionTree.train(data, oldStrategy)
 val newData: DataFrame = TreeTests.setMetadata(data, categoricalFeatures, numClasses = 0)
 val newTree = dt.fit(newData)
-// Use parent, fittingParamMap from newTree since these are not checked anyways.
+// Use parent from newTree since this is not checked anyways.
 val oldTreeAsNew = DecisionTreeRegressionModel.fromOld(
 oldTree, newTree.parent.asInstanceOf[DecisionTreeRegressor], categoricalFeatures)
 TreeTests.checkEqual(oldTreeAsNew, newTree)

mllib/src/test/scala/org/apache/spark/ml/regression/GBTRegressorSuite.scala (1 addition, 1 deletion)

@@ -129,7 +129,7 @@ private object GBTRegressorSuite {
 val oldModel = oldGBT.run(data)
 val newData: DataFrame = TreeTests.setMetadata(data, categoricalFeatures, numClasses = 0)
 val newModel = gbt.fit(newData)
-// Use parent, fittingParamMap from newTree since these are not checked anyways.
+// Use parent from newTree since this is not checked anyways.
 val oldModelAsNew = GBTRegressionModel.fromOld(
 oldModel, newModel.parent.asInstanceOf[GBTRegressor], categoricalFeatures)
 TreeTests.checkEqual(oldModelAsNew, newModel)

0 commit comments