Algorithms are powerful tools for processing even large masses of heterogeneous data and arriving at interesting results. Artificial intelligence (AI) takes the lion's share in this field because it encompasses a family of algorithms that can produce results of various kinds (a number, a cluster, etc.) from input data. Leaving aside scientific research or image and video analysis, where the amount of data is significant and in most cases qualifies as Big Data, in all other areas there is often little data, some of it incorrect and in general unsuitable for algorithms that require a database of both high quality and sufficient quantity.

GLM vs AI: what is General Linear Modelling?

When the analysis at hand does not involve great depth or volume of information, are you forced to give up algorithms altogether? In reality, in some cases a solution can still be found: some statistical methods allow accurate analyses of high value for companies. In this article the focus is on GLM (General Linear Modelling). There is a wide variety of models, but this is certainly one of the most famous and widely used. Suppose we want to estimate (forecast) the future sales volumes of our eCommerce and we have the data for the transactions of the last 48 months. Data scarcity is not the problem here, but if the database amounts to about a thousand transactions across 30 products, it is clear that we cannot and should not go down the road of AI.

GLM vs AI: how does General Linear Modelling work?

Without going into mathematical detail, GLM assigns a weight to each column of the transaction file and finally estimates the value of an "objective" (target) variable, here the sales volume, for example per day. The resulting model is linear, that is, representable as a straight line. Once the model has been fitted, its "quality" can be estimated through indicators such as the p-value.
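As a minimal sketch of this idea (assuming Python with numpy; the feature columns and sales figures below are entirely made up for illustration), an ordinary least-squares fit assigns a weight to each column and yields the straight-line model described above:

```python
import numpy as np

# Hypothetical daily aggregates: each row is one day of the transaction file.
# Columns: [price, promo_flag, weekday_index]; target: daily sales volume.
X = np.array([
    [10.0, 0, 1],
    [ 9.5, 1, 2],
    [10.0, 0, 3],
    [ 9.0, 1, 4],
    [10.5, 0, 5],
    [ 9.0, 1, 6],
])
y = np.array([100, 130, 98, 140, 90, 142], dtype=float)

# Add an intercept column, then solve ordinary least squares:
# each coefficient is the "weight" the model assigns to that column.
A = np.column_stack([np.ones(len(X)), X])
coef, residuals, rank, _ = np.linalg.lstsq(A, y, rcond=None)

# Fitted daily sales volumes according to the linear model.
pred = A @ coef
print(np.round(coef, 2))
```

The p-values mentioned above are not part of this bare-bones fit; a fuller statistical package (e.g. statsmodels) would also report significance indicators for each weight.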

For those who don't deal with data analysis daily, even these basic notions may already cause some headaches, but there is no need to be intimidated by data science. Support from experts in the field is essential, but setting up such a system is feasible: it is a matter of creating the right IT flows to feed the model and implement it, making sure that the results can be accessed through a dedicated interface or third-party applications.

GLM vs AI: pros, cons and differences

Why, then, do we only hear about Artificial Intelligence as a solution for optimizing business choices? In general, is AI or GLM better? There is no correct answer in an absolute sense, but some considerations can be shared by anyone involved in data analysis:

  • The GLM is a simpler methodology and needs a smaller amount of data to generate results with an acceptable margin of error.
  • If you have a large amount of data to analyze, the results obtained with AI are better, and the speed of adaptation to changes in data patterns is significantly higher (for example, in response to the changes in purchasing and consumption habits related to the pandemic; it is no coincidence that 2020 is considered year zero for estimators, as data from previous years are in many cases unusable and misleading in predictive analysis).

GLM vs AI: use cases

Beyond predictive analytics, in what other areas can GLM be used?

As with AI, the opportunities are many, for example:

  • Calculate prices by optimizing the straight line resulting from the model
  • Quantitatively estimate the impact of variables on a given event (e.g. impact of holidays on sales)

The choice to use GLM may also be dictated by a desire to get out of a situation of immobility in data analysis. The process often adopted is as follows:

  1. The company has an insufficient amount of data to use AI algorithms.
  2. It chooses to adopt GLM for analysis.
  3. In the meantime, data acquisition processes are implemented, improved and better organized.
  4. The quantity and quality of the collected data reach the level required to use AI tools.

In summary, GLM can help a company better understand the data it already has, and it is a better solution than standing still and postponing all analysis until enough data has been acquired to use Artificial Intelligence.
