diff --git a/project/Final_Report.ipynb b/project/Final_Report.ipynb
index 3443508..bd1e85c 100644
--- a/project/Final_Report.ipynb
+++ b/project/Final_Report.ipynb
@@ -2785,7 +2785,7 @@
  "\n",
  "- **Feature Importance**: Track temperature, tire degradation, and weather complexity were key predictors of lap times. Being able to Normalize the data and engineer features was crucial to the success of the models. It was difficult to predict the British GP due to the high variability in weather conditions. \n",
  "\n",
- "- **Track-Specific Optimization**: Tailoring models to specific tracks improved performance significantly. Using Hyper paramaters such as n_estimators, max_depth, learning_rate, min_child_samples, subsample, colsample_bytree, reg_alpha, reg_lambda, num_leaves, feature_fraction, bagging_fraction, and bagging_freq were crucial to the success of the models. These hyper parameters allowed us to optimize the models for each track. Feactures ike max_depth which is the maximum depth of the tree, learning_rate which is the rate at which the model learns, and n_estimators which is the number of trees in the model were crucial to the success of the models. Messing with these hyper parameters allowed us to optimize the models for each track. \n",
+ "- **Track-Specific Optimization**: Tailoring models to specific tracks improved performance significantly. Hyperparameters such as n_estimators, max_depth, learning_rate, min_child_samples, subsample, colsample_bytree, reg_alpha, reg_lambda, num_leaves, feature_fraction, bagging_fraction, and bagging_freq were crucial to the success of the models. Parameters like max_depth (the maximum depth of each tree), learning_rate (the rate at which the model learns), and n_estimators (the number of trees in the model) were especially important. Tuning these hyperparameters allowed us to optimize the models for each track. \n",
  "\n",
  "- **Future Work**: Further optimization of hyperparameters and feature engineering could enhance model performance. Exploring additional ensemble methods and deep learning approaches may provide further insights.\n",
  "\n",
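The per-track tuning described in the revised bullet can be sketched roughly as follows. This is a minimal illustration only: the track names, parameter values, and synthetic data are invented, and scikit-learn's `GradientBoostingRegressor` stands in for the LightGBM-style model implied by parameters like `num_leaves` and `feature_fraction` (using only the subset of hyperparameters the two libraries share).

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical per-track hyperparameter sets (illustrative values,
# not the ones actually used in the report).
TRACK_PARAMS = {
    "Silverstone": {"n_estimators": 300, "max_depth": 4,
                    "learning_rate": 0.05, "subsample": 0.8},
    "Monza":       {"n_estimators": 200, "max_depth": 6,
                    "learning_rate": 0.10, "subsample": 0.9},
}

def fit_track_model(track, X, y):
    """Fit a separate gradient-boosted model for each track,
    using that track's own hyperparameters."""
    params = TRACK_PARAMS[track]
    model = GradientBoostingRegressor(random_state=0, **params)
    model.fit(X, y)
    return model

# Tiny synthetic stand-in for engineered features
# (e.g. track temperature, tire age, weather index).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

model = fit_track_model("Silverstone", X, y)
preds = model.predict(X)
```

Keeping one parameter dictionary per track makes the "track-specific" part explicit: the training loop is shared, and only the hyperparameters vary.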