As of today, FastAPI is one of the most popular web frameworks for building microservices in Python 3.6+. Deploying machine learning models in a microservice-based architecture makes code components reusable, easier to maintain and test, and keeps response times low. FastAPI is built on ASGI (Asynchronous Server Gateway Interface) instead of Flask's WSGI (Web Server Gateway Interface), which is why it is faster than Flask-based APIs.
The term Machine Learning was coined by Arthur Samuel in 1959, who defined it as the “field of study that gives computers the ability to learn without being explicitly programmed”. Machine Learning now helps organizations make data-driven decisions rather than relying solely on experience-driven ones.
Further in this article, we discuss the benefits of Machine Learning in the mobile app development process and everything that revolves around using Machine Learning to build a mobile app.
In the field of Machine Learning, logistic regression is still a top choice for classification problems. It is a simple yet efficient algorithm that produces accurate models in most cases. In its basic form, it applies the logistic function to a linear combination of the input features to calculate a probability score, which is then used to assign the binary dependent variable to its class. In this post I explain the end-to-end steps involved in solving a classification problem with logistic regression and also analyze the model output in detail with various performance metrics.
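As a quick illustration of that basic workflow (not the post's own code), here is a hedged scikit-learn sketch; the dataset, split, and `max_iter` value are assumptions chosen for brevity:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# a binary classification dataset bundled with scikit-learn
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# the logistic function maps the linear combination w.x + b to a probability
model = LogisticRegression(max_iter=5000)
model.fit(X_train, y_train)

proba = model.predict_proba(X_test)[:, 1]  # probability scores in [0, 1]
pred = model.predict(X_test)               # classes, thresholded at 0.5
acc = accuracy_score(y_test, pred)
```

A fuller analysis would add the other performance metrics the post discusses, e.g. a confusion matrix and ROC-AUC.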
Data Science and Machine Learning Articles | Yearly round-up 2019
Boosting helps to improve the accuracy of any given machine learning algorithm. It is algorithm-independent, so we can apply it with any base learner. Unlike bagging, it is primarily aimed at reducing bias rather than model variance. Boosting involves many sequential iterations to strengthen the model's accuracy, so it becomes computationally costly.
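The sequential idea can be sketched with AdaBoost (my own illustration, assuming scikit-learn; the synthetic dataset and iteration count are arbitrary choices):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# a single weak learner (a depth-1 "stump") for comparison
stump = DecisionTreeClassifier(max_depth=1)
stump_acc = stump.fit(X_train, y_train).score(X_test, y_test)

# AdaBoost fits weak learners sequentially, re-weighting the training
# examples so that later learners focus on earlier mistakes
boosted = AdaBoostClassifier(n_estimators=100, random_state=0)
boosted_acc = boosted.fit(X_train, y_train).score(X_test, y_test)
```

The 100 sequential fits are exactly the computational cost mentioned above: unlike bagging, they cannot be trained in parallel.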
Ensemble Learning says: if we can build multiple models, why select only the best one? Why not the top 2, the top 3, or even the top 10? If you find the top 10 models, deploy all 10. When new data arrives, get a prediction from each of the 10 models, combine those predictions, and make a joint prediction. This is the key idea of ensemble learning.
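That joint prediction can be sketched with a majority vote (a hedged example assuming scikit-learn, using three models instead of ten for brevity):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# "deploy" several models and combine their predictions by majority vote
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("tree", DecisionTreeClassifier(random_state=1)),
        ("rf", RandomForestClassifier(n_estimators=50, random_state=1)),
    ],
    voting="hard",  # each model casts one vote; the majority class wins
)
ensemble.fit(X_train, y_train)
ensemble_acc = ensemble.score(X_test, y_test)
```

With `voting="soft"` the models' predicted probabilities would be averaged instead of counting votes, which is another common way to combine the predictions.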