introduction_to_amazon_algorithms/jumpstart_image_classification/Amazon_JumpStart_Image_Classification.ipynb
85 additions & 9 deletions
@@ -41,8 +41,9 @@
 "4. [Fine-tune the pre-trained model on a custom dataset](#4.-Fine-tune-the-pre-trained-model-on-a-custome-dataset)\n",
 " * [Retrieve JumpStart Training artifacts](#4.1.-Retrieve-JumpStart-Training-artifacts)\n",
 " * [Set Training parameters](#4.2.-Set-Training-parameters)\n",
-" * [Start Training](#4.3.-Start-Training)\n",
-" * [Deploy & run Inference on the fine-tuned model](#4.4.-Deploy-&-run-Inference-on-the-fine-tuned-model)"
+" * [Train with Automatic Model Tuning (HPO)](#AMT)\n",
+" * [Start Training](#4.4.-Start-Training)\n",
+" * [Deploy & run Inference on the fine-tuned model](#4.5.-Deploy-&-run-Inference-on-the-fine-tuned-model)"
"### 4.3. Train with Automatic Model Tuning ([HPO](https://docs.aws.amazon.com/sagemaker/latest/dg/automatic-model-tuning.html)) <a id='AMT'></a>\n",
503
+
"***\n",
504
+
"Amazon SageMaker automatic model tuning, also known as hyperparameter tuning, finds the best version of a model by running many training jobs on your dataset using the algorithm and ranges of hyperparameters that you specify. It then chooses the hyperparameter values that result in a model that performs the best, as measured by a metric that you choose. We will use a [HyperparameterTuner](https://sagemaker.readthedocs.io/en/stable/api/training/tuner.html) object to interact with Amazon SageMaker hyperparameter tuning APIs.\n",
"# You can select from the hyperparameters supported by the model, and configure ranges of values to be searched for training the optimal model.(https://docs.aws.amazon.com/sagemaker/latest/dg/automatic-model-tuning-define-ranges.html)\n",
"## 4.4. Deploy & run Inference on the fine-tuned model\n",
616
+
"## 4.5. Deploy & run Inference on the fine-tuned model\n",
541
617
"***\n",
542
618
"A trained model does nothing on its own. We now want to use the model to perform inference. For this example, that means predicting the class label of an image. We follow the same steps as in [3. Run inference on the pre-trained model](#3.-Run-inference-on-the-pre-trained-model). We start by retrieving the jumpstart artifacts for deploying an endpoint. However, instead of base_predictor, we deploy the `ic_estimator` that we fine-tuned.\n",

 ### SageMaker JumpStart Image classification Training & Deployment
-This notebook `Amazon_JumpStart_Image_Classification.ipynb` demos how to fine-tune and deploy a pre-trained image classification model using JumpStart API. It shows how to select a pre-trained image classification model from JumpStart and fine-tune it on an example dataset containing raw .jpg/.png images, while varying training hyperparameters such as learning rate, batch-size and number of epochs. Once the training is complete, the notebook shows how to host the trained model for inference. It also shows how to host the pre-trained model as-it-is without first fine-tuning it.
+This notebook `Amazon_JumpStart_Image_Classification.ipynb` demos how to fine-tune and deploy a pre-trained image classification model using JumpStart API. It shows how to select a pre-trained image classification model from JumpStart and fine-tune it on an example dataset containing raw .jpg/.png images, while varying training hyperparameters such as learning rate, batch-size and number of epochs. AMT (Automatic Model Tuning) is used to search for the best hyperparameters. Once the training is complete, the notebook shows how to host the trained model for inference. It also shows how to host the pre-trained model as-it-is without first fine-tuning it.
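
To make the README's workflow concrete, here is a sketch of how a JumpStart model is typically selected and its training artifacts retrieved with the SageMaker Python SDK. The `model_id` and instance type are illustrative assumptions, not values from this diff.

```python
# Sketch of selecting a JumpStart image classification model and retrieving
# its training artifacts. The model_id and instance type are assumptions.
from sagemaker import image_uris, model_uris, script_uris

model_id, model_version = "pytorch-ic-mobilenet-v2", "*"  # hypothetical choice
training_instance_type = "ml.p3.2xlarge"

# Docker image used for fine-tuning.
train_image_uri = image_uris.retrieve(
    region=None,
    framework=None,
    model_id=model_id,
    model_version=model_version,
    image_scope="training",
    instance_type=training_instance_type,
)
# Training script and pre-trained weights hosted by JumpStart.
train_source_uri = script_uris.retrieve(
    model_id=model_id, model_version=model_version, script_scope="training"
)
train_model_uri = model_uris.retrieve(
    model_id=model_id, model_version=model_version, model_scope="training"
)
```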
"### 3.3. Train with Automatic Model Tuning ([HPO](https://docs.aws.amazon.com/sagemaker/latest/dg/automatic-model-tuning.html)) <a id='AMT'></a>\n",
610
+
"***\n",
611
+
"Amazon SageMaker automatic model tuning, also known as hyperparameter tuning, finds the best version of a model by running many training jobs on your dataset using the algorithm and ranges of hyperparameters that you specify. It then chooses the hyperparameter values that result in a model that performs the best, as measured by a metric that you choose. We will use a [HyperparameterTuner](https://sagemaker.readthedocs.io/en/stable/api/training/tuner.html) object to interact with Amazon SageMaker hyperparameter tuning APIs.\n",
"# You can select from the hyperparameters supported by the model, and configure ranges of values to be searched for training the optimal model.(https://docs.aws.amazon.com/sagemaker/latest/dg/automatic-model-tuning-define-ranges.html)\n",