Spatial Explanations: Unlocking Insights with Occlusions

In computer vision, businesses must understand how their image models work to fully leverage visual data. Our simple method, spatial explanations with occlusions, helps achieve that deeper understanding. By occluding different regions of an image, the technique reveals the areas that most influence the model's predictions.

What to do with these insights, you may ask. Gaining a deeper understanding of your system is the first step to optimizing your business. You can use these insights for two purposes:

  • Validate your AI: During your initial experiments, you may encounter low-quality data, such as small, biased, or unrepresentative datasets. As a result, your model might focus on areas that you know are irrelevant to your system, no matter how good the metrics on the test set look. Visual explanations let you quickly assess how trustworthy your AI is.
  • Learn with the AI: Learn from your model which factors are most relevant for decision-making. This gives you more control over your system and a faster path to business optimization.

 

How to get clarity from Occlusions?

Spatial explanations rely on the concept of occlusion, where specific areas of an image are masked or occluded to observe their impact on the model’s predictions. To do so, a patch of the original image is replaced with the average values from the dataset in the corresponding location. The original image and the image with the occlusion are analyzed by the predictive model and the difference between those predictions is used as a proxy of the region’s importance. The higher the difference, the greater the impact of the region. By repeating this process with patches from different locations and of different sizes, it is possible to build a heatmap, where the temperature represents the impact on the predictions.
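The procedure described above can be sketched in a few lines of NumPy. This is a minimal sketch, assuming a grayscale image and a `predict` function that maps an image to a scalar score for the class of interest; the function name, shapes, and defaults are illustrative assumptions, not a fixed API:

```python
import numpy as np

def occlusion_heatmap(image, dataset_mean, predict, patch_size=8, stride=4):
    """Build an importance heatmap by occluding patches with dataset-average values.

    `predict` maps an image of shape (H, W) to a scalar score for the class
    of interest; `dataset_mean` has the same shape as `image`.
    """
    h, w = image.shape
    heatmap = np.zeros((h, w))
    counts = np.zeros((h, w))
    base_score = predict(image)
    for y in range(0, h - patch_size + 1, stride):
        for x in range(0, w - patch_size + 1, stride):
            occluded = image.copy()
            # Replace the patch with the average values from the dataset
            occluded[y:y + patch_size, x:x + patch_size] = \
                dataset_mean[y:y + patch_size, x:x + patch_size]
            # Proxy for the region's importance: how much the prediction changes
            diff = abs(base_score - predict(occluded))
            heatmap[y:y + patch_size, x:x + patch_size] += diff
            counts[y:y + patch_size, x:x + patch_size] += 1
    # Average over overlapping patches to get the final "temperature"
    return heatmap / np.maximum(counts, 1)
```

Repeating the call with several `patch_size` values and aggregating the results yields the multi-scale heatmap described above.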

Replacing the occlusion with average values from the dataset is particularly effective in industrial settings, where cameras and most objects remain fixed. In scenarios where the average patch occlusion is not suitable, you can employ alternative methods. Here are some other options:

  • Occluding with a black patch – We don’t recommend this one, since too many zeros in the input can deactivate chains of neural activations and return odd predictions.
  • Employing advanced inpainting techniques – Returns more realistic fills, but introduces a dependency on a third-party tool.
  • Decreasing the resolution – Instead of replacing the patch with different information, blur or pixelate it, reducing the amount of information without creating odd edges and textures in the image.
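As an illustration of the last option, pixelation can be implemented by averaging small blocks inside the patch. This is a sketch only; the block size and the in-place averaging are our assumptions:

```python
import numpy as np

def pixelate_patch(image, y, x, size, block=4):
    """Occlude by pixelation: replace each small block inside the patch with
    its own average, reducing information without creating sharp edges."""
    out = image.copy()
    patch = out[y:y + size, x:x + size]  # view into the copy
    for by in range(0, size, block):
        for bx in range(0, size, block):
            patch[by:by + block, bx:bx + block] = \
                patch[by:by + block, bx:bx + block].mean()
    return out
```

The pixelated image can then be passed to `predict` exactly like the average-patch occlusion.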

This adaptability allows the spatial occlusions explanation method to be tailored to the specific needs of your business.

 

Some tricks up your sleeve

Well done! Now you know the rationale behind our method. Here are a few more tips to get the best from your explainability heatmaps.

  1. Mitigating Grid Patterns: Large strides can produce grid patterns that reduce the heatmap’s interpretability. You have two options to overcome this. First, you can reduce the stride, which yields clearer explanations but increases computation time. Alternatively, applying a Gaussian blur to the resulting heatmap smooths out the patterns, producing softer color transitions.
  2. Weighted Averaging: We also enhance our spatial explanations by weighting each patch inversely to its size. This ensures that patches of different sizes contribute appropriately when their heatmaps are combined, leading to more accurate and reliable explanations.
  3. Addressing Repetitive Explanations: In industrial contexts, explanations can become repetitive, consistently highlighting the same locations. We recommend comparing each image’s explanation values to the dataset’s average explanation, for example via the absolute difference. This highlights regions that deviate from the norm, enabling a deeper understanding of what makes them distinctive.
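The three tips above can be sketched as follows. The kernel width, the weight inversely proportional to the patch area, and the function names are all illustrative assumptions:

```python
import numpy as np

def gaussian_smooth(heatmap, sigma=2.0):
    """Tip 1: separable Gaussian blur to soften grid patterns left by large strides."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-0.5 * (x / sigma) ** 2)
    kernel /= kernel.sum()
    # Edge-pad, convolve rows then columns, then crop back to the original size
    padded = np.pad(heatmap, radius, mode="edge")
    rows = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, padded)
    out = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, rows)
    return out[radius:-radius, radius:-radius]

def combine_scales(heatmaps, patch_sizes):
    """Tip 2: weight each patch size inversely to its area before averaging."""
    weights = np.array([1.0 / (p * p) for p in patch_sizes])
    weights /= weights.sum()
    return sum(w * h for w, h in zip(weights, heatmaps))

def deviation_from_average(heatmap, dataset_avg_heatmap):
    """Tip 3: absolute difference from the dataset's average explanation."""
    return np.abs(heatmap - dataset_avg_heatmap)
```

In practice, `dataset_avg_heatmap` would be the mean of the heatmaps computed over a representative sample of your dataset.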

 


Conclusion

Spatial explanations offer a powerful approach to interpreting image models and extracting valuable insights from visual data. By employing spatial occlusions, businesses can unravel the inner workings of their models, enabling informed decision-making and enhanced trust. Embrace the power of spatial explanations to unlock the full potential of your image models. Contact us if you’re ready to embark on a journey of comprehensive understanding and actionable insights.

