How to build ML models when classes are dynamic and ridiculously large
Kelwin on Sep 22, 2022
Let’s face it, we’ve all worked on an ML project where we had to predict a ridiculously high number of classes; large enough that the number of observations per class becomes embarrassingly small. Most people model these tasks as a multiclass classification problem where, for each input observation, we must predict the most likely class (or the class probabilities).
Examples of such tasks are predicting the model of a car, the species of an animal, the intent of a user in a chat, the SIC/NAICS code of a company, and the product in a marketplace picture, among many others.
These examples are also characterized by a dynamic number of classes. For example, let’s say we are training a Computer Vision model to recognize the item in a photo for an autonomous retail store. Every day, new products hit the market. If you go with the traditional approach, you must train a new model daily to keep up with the catalog.
This would make model maintenance (and operations) go wild! You don’t want that!
Our trick for this kind of model is converting classes into part of the question. So, instead of training a multiclass classification model that predicts:
What’s the class of the observation? – a categorical question
we ask the question:
Is this observation from a given category? – a yes or no question.
I like to call this trick flipping your model upside down, making the outcome part of the inputs.
Technically, we transform our predictive model f(x) = P(y = c | x), which scores every possible class c for an observation x, into f(x, c) = P(x belongs to c), a binary model that takes the candidate class as part of its input.
Then, for any given observation, you just ask the question for every class and take the one with the highest probability.
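To make that concrete, here is a minimal sketch of what inference looks like once the class is part of the input. The binary model and the class_descriptors dictionary are hypothetical placeholders, assuming a scikit-learn-style classifier trained on concatenated (observation, class) feature vectors:

```python
import numpy as np

def predict_class(model, x, class_descriptors):
    """Ask 'does x belong to c?' for every candidate class c and keep the best answer."""
    scores = {}
    for class_id, class_features in class_descriptors.items():
        pair = np.concatenate([x, class_features]).reshape(1, -1)
        scores[class_id] = model.predict_proba(pair)[0, 1]  # P(x belongs to class_id)
    return max(scores, key=scores.get)
```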
Is there a new class? Don’t worry; just ask one additional question next time you need to generate a prediction. No re-training required.
Disclaimer: this works as long as your initial set of classes is general enough. Otherwise, just re-train every now and then.
Let’s say you have features describing the classes. Then, you just need to encode the class as the set of features that describe it. For example, in the retail product recognition example, you can characterize the item by its category, brand, weight, size, color, description, ingredients, etc.
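As a sketch (the column names and catalog structure are made up for illustration), building the training set for the yes/no model amounts to pairing each observation with the descriptors of a candidate class and labeling whether they match:

```python
import numpy as np
import pandas as pd

def build_pairs(observations: pd.DataFrame, catalog: pd.DataFrame, n_negatives: int = 3):
    """Turn a multiclass dataset into (observation, class) pairs for a yes/no model.
    `observations` has feature columns plus a `class_id`; `catalog` has one row of
    descriptive features per class_id (category, brand, weight, ...)."""
    rows, labels = [], []
    rng = np.random.default_rng(0)
    for _, obs in observations.iterrows():
        true_class = obs["class_id"]
        # Positive pair: the observation with its true class descriptors.
        rows.append(pd.concat([obs.drop("class_id"), catalog.loc[true_class]]))
        labels.append(1)
        # Negative pairs: the same observation with a few random other classes.
        for neg in rng.choice(catalog.index[catalog.index != true_class], n_negatives, replace=False):
            rows.append(pd.concat([obs.drop("class_id"), catalog.loc[neg]]))
            labels.append(0)
    return pd.DataFrame(rows).reset_index(drop=True), np.array(labels)
```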
However, it’s not so common to have features describing the classes. How would you describe a user’s intent in a chat? How would you describe a car model or an animal?
Yes, it would be possible. But my bet is that you won’t have access to such data.
What to do in such a situation?
You must agree that you have features for the observations belonging to each class, right? In that case, you can derive features from the distribution of the observations in that class: statistics like the average, minimum, maximum, and variance of each feature. Now you have features describing the class. You’re welcome.
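A minimal sketch with pandas (column names are illustrative): group the observations by class and summarize each feature’s distribution.

```python
import pandas as pd

def class_descriptors_from_observations(df: pd.DataFrame, feature_cols: list) -> pd.DataFrame:
    """Describe each class by the distribution of its observations:
    per-feature mean, min, max, and variance, one row per class."""
    stats = df.groupby("class_id")[feature_cols].agg(["mean", "min", "max", "var"])
    # Flatten the (feature, statistic) column MultiIndex into plain names.
    stats.columns = [f"{feat}_{stat}" for feat, stat in stats.columns]
    return stats
```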
Hey Kelwin, but you know, aren’t features old-fashioned? We all work with deep learning nowadays and leave the model to learn its own features. I’m glad you asked, young grasshopper!
You can train a siamese neural network that answers the question:
Are these two observations from the same class?
Or, in more formal language: f(x1, x2) = P(y1 = y2 | x1, x2), the probability that two observations share the same class.
Now, you can ask the question comparing your new test observation against all training data points, aggregate the probabilities by class (e.g., maximum, average) and return the class with the highest score. Basically, you can just transform a multiclass problem into a similarity learning one.
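Here is a rough sketch of such a siamese network in Keras; the architecture and layer sizes are placeholders, not a prescription. You train it on pairs of observations labeled 1 when they share a class and 0 otherwise, and at inference time you can keep just the shared encoder to embed observations.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_siamese(input_dim: int, embedding_dim: int = 32) -> Model:
    """Two inputs go through one shared encoder; the head predicts P(same class)."""
    encoder = tf.keras.Sequential([
        layers.Dense(64, activation="relu"),
        layers.Dense(embedding_dim, activation="relu"),
    ])
    x1 = layers.Input(shape=(input_dim,))
    x2 = layers.Input(shape=(input_dim,))
    z1, z2 = encoder(x1), encoder(x2)
    # Compare the two embeddings and squash the comparison into a probability.
    diff = layers.Lambda(lambda t: tf.abs(t[0] - t[1]))([z1, z2])
    out = layers.Dense(1, activation="sigmoid")(layers.Dense(16, activation="relu")(diff))
    model = Model(inputs=[x1, x2], outputs=out)
    model.compile(optimizer="adam", loss="binary_crossentropy")
    return model
```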
Are you crazy? That won’t scale at all. Well, it will. First of all, you only need to index all training observations once. Then, whenever a new input arrives, you run your neural network on it to get its latent features, followed by a simple nearest-neighbor comparison against the indexed data points.
Still, can you imagine doing that over millions of observations? Of course not, but you can always choose pivots that represent each class properly, using any technique such as k-medoids in the latent space. Easy peasy.
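Putting the last two ideas together, here is a sketch with scikit-learn. It assumes `encoder` is the shared tower from the siamese sketch above, and it approximates k-medoids by snapping k-means centroids to the nearest real observation, which is enough to make the point:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import NearestNeighbors

def build_pivot_index(encoder, X_train, y_train, pivots_per_class=5):
    """Embed the training data once, keep a few representative pivots per class,
    and index them for fast nearest-neighbor lookup."""
    Z = encoder.predict(X_train)
    pivots, labels = [], []
    for c in np.unique(y_train):
        Zc = Z[y_train == c]
        k = min(pivots_per_class, len(Zc))
        centroids = KMeans(n_clusters=k, n_init=10).fit(Zc).cluster_centers_
        # Snap each centroid to the closest real observation (a cheap medoid stand-in).
        nearest = NearestNeighbors(n_neighbors=1).fit(Zc).kneighbors(centroids, return_distance=False)
        pivots.append(Zc[nearest.ravel()])
        labels.extend([c] * k)
    index = NearestNeighbors(n_neighbors=5).fit(np.vstack(pivots))
    return index, np.array(labels)

def classify(encoder, index, pivot_labels, x):
    """Embed a new observation, find its nearest pivots, and vote by class."""
    z = encoder.predict(x[None, :])
    neighbors = index.kneighbors(z, return_distance=False)[0]
    classes, counts = np.unique(pivot_labels[neighbors], return_counts=True)
    return classes[np.argmax(counts)]
```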
Now, you have a scalable model that adjusts to new classes without the need for re-training.
We have used this trick in several industries and use cases, and it always pays for itself.
You gain a lot of operational efficiency and mitigate the problem of low-frequency classes.
Is a class no longer relevant? Remove its observations from your index.
Is there any new class? Add new observations to your index.
As easy as that!
There are a couple of additional tricks we can teach you, but you will need to wait for another article. I have to leave. But you don’t. So, subscribe now to our newsletter below to stay tuned.