Spatial Explanations: Unlocking Insights with Occlusions

In computer vision, businesses must grasp the workings of their image models to fully leverage visual data. Our simple method, spatial explanations with occlusions, helps achieve that deeper understanding. By applying occlusions across an image, the technique reveals the areas that most strongly influence the model’s predictions.

What can you do with these insights, you may ask? A deeper understanding of your system is the first step to optimizing your business. You can use these insights for two purposes:

  • Validate the AI: During your first experiments, you may encounter low-quality data, such as small, biased, or unrepresentative datasets. As a result, your model might focus on areas that you know are irrelevant to your system, no matter how good the metrics on the test set look. Using visual explanations, you can quickly assess how trustworthy your AI is.
  • Learn with the AI: Learn from your model which factors are most relevant for decision-making. This gives you more control over your system and gets you to business optimization faster.


How to get clarity from Occlusions?

Spatial explanations rely on the concept of occlusion, where specific areas of an image are masked, or occluded, to observe their impact on the model’s predictions. To do so, a patch of the original image is replaced with the average values from the dataset in the corresponding location. The original image and the occluded image are then passed through the predictive model, and the difference between the two predictions is used as a proxy for the region’s importance: the higher the difference, the greater the impact of the region. By repeating this process with patches at different locations and of different sizes, it is possible to build a heatmap whose temperature represents the impact on the predictions.
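To make the procedure concrete, here is a minimal sketch in Python with NumPy. It assumes a hypothetical `model.predict` that returns class probabilities for a batch of images and a precomputed per-pixel `dataset_mean` image; both names are placeholders for your own model and data, not part of any specific library.

```python
# A minimal sketch of the occlusion procedure, assuming a hypothetical
# `model.predict` that maps a batch of images (N, H, W, C) to class
# probabilities, and a precomputed per-pixel `dataset_mean` image.
import numpy as np

def occlusion_heatmap(image, model, dataset_mean, patch_size=32, stride=16, target_class=0):
    """Estimate each region's importance by occluding it with the dataset average."""
    h, w = image.shape[:2]
    heatmap = np.zeros((h, w), dtype=np.float32)
    counts = np.zeros((h, w), dtype=np.float32)

    # Prediction on the unmodified image is the reference point.
    base_score = model.predict(image[None])[0, target_class]

    for top in range(0, h - patch_size + 1, stride):
        for left in range(0, w - patch_size + 1, stride):
            occluded = image.copy()
            # Replace the patch with the dataset average at the same location.
            occluded[top:top + patch_size, left:left + patch_size] = \
                dataset_mean[top:top + patch_size, left:left + patch_size]
            score = model.predict(occluded[None])[0, target_class]
            # The larger the drop in the prediction, the more important the region.
            heatmap[top:top + patch_size, left:left + patch_size] += base_score - score
            counts[top:top + patch_size, left:left + patch_size] += 1

    # Average the contributions of overlapping patches.
    return heatmap / np.maximum(counts, 1)
```

Regions with higher values in the resulting heatmap are the ones whose occlusion changed the prediction the most.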

Filling the occluded patch with average values from the dataset is particularly effective in industrial settings, where cameras and most objects remain fixed. In scenarios where the average-patch occlusion is not suitable, you can employ alternative methods. Here are some other options:

  • Occluding with a black patch – We don’t recommend this one, since feeding too many zeros to the network can deactivate whole chains of activations and return odd predictions.
  • Employing advanced inpainting techniques – Returns more reliable results but makes you dependent on a third-party tool.
  • Decreasing the resolution – Instead of replacing the patch with different information, blur or pixelate it, reducing the amount of information without creating odd edges and textures in the image (a minimal sketch of these alternatives follows this list).
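For the blur and pixelation options, a sketch using OpenCV might look like the following; the kernel size and pixelation factor are illustrative assumptions, and either function can stand in for the mean-patch replacement step in the heatmap loop above.

```python
# Illustrative blur and pixelation occlusions using OpenCV; the kernel size and
# pixelation factor are arbitrary choices, tune them to your image resolution.
import cv2

def occlude_with_blur(image, top, left, size, kernel=(15, 15)):
    """Blur a patch instead of replacing it, avoiding artificial edges."""
    occluded = image.copy()
    patch = occluded[top:top + size, left:left + size]
    occluded[top:top + size, left:left + size] = cv2.GaussianBlur(patch, kernel, 0)
    return occluded

def occlude_with_pixelation(image, top, left, size, factor=8):
    """Pixelate a patch by downscaling and upscaling it, removing fine detail."""
    occluded = image.copy()
    patch = occluded[top:top + size, left:left + size]
    low_res = max(size // factor, 1)
    small = cv2.resize(patch, (low_res, low_res), interpolation=cv2.INTER_LINEAR)
    occluded[top:top + size, left:left + size] = cv2.resize(
        small, (size, size), interpolation=cv2.INTER_NEAREST)
    return occluded
```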

This adaptability allows the spatial occlusions explanation method to be tailored to the specific needs of your business.


Some tricks up your sleeve

Well done! Now you know the rationale behind our method. Here are a few more tips to get the best from your explanation heatmaps.

  1. Mitigating Grid Patterns: Large strides can leave grid patterns in the heatmap, reducing its interpretability. To overcome this, you have two options. You can reduce the stride, which produces clearer explanations but increases computation time, or you can apply a Gaussian blur to the resulting heatmap, smoothing out the patterns and the color transitions.
  2. Weighted Averaging: When combining heatmaps computed with different patch sizes, we weight each one inversely to its patch size. This ensures that the contribution of each patch is appropriately balanced, leading to more accurate and reliable explanations.
  3. Addressing Repetitive Explanations: In industrial contexts, explanations can become repetitive, consistently highlighting the same locations. We recommend comparing the image’s explanation values to the dataset’s average explanation, for instance via the absolute difference. This highlights regions that deviate from the norm, giving a deeper understanding of what makes each image distinctive. A sketch combining these tips follows this list.
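Below is a minimal sketch of how these tips could be combined, reusing the `occlusion_heatmap` function from the earlier example; the patch sizes, weights, and smoothing parameters are illustrative assumptions rather than a prescribed recipe.

```python
# A minimal sketch combining the tips above, reusing `occlusion_heatmap` from
# the earlier example; patch sizes, weights, and sigma are illustrative choices.
import numpy as np
from scipy.ndimage import gaussian_filter

def multiscale_heatmap(image, model, dataset_mean, patch_sizes=(16, 32, 64), sigma=5.0):
    """Combine heatmaps from several patch sizes, weighting each inversely to its size."""
    combined = np.zeros(image.shape[:2], dtype=np.float32)
    total_weight = 0.0
    for size in patch_sizes:
        hm = occlusion_heatmap(image, model, dataset_mean,
                               patch_size=size, stride=size // 2)
        weight = 1.0 / size  # smaller, more precise patches contribute more
        combined += weight * hm
        total_weight += weight
    # A Gaussian blur on the heatmap softens the grid pattern left by large strides.
    return gaussian_filter(combined / total_weight, sigma=sigma)

def deviation_from_average(heatmap, average_heatmap):
    """Compare an image's explanation to the dataset's average explanation."""
    # `average_heatmap` is assumed to be the mean heatmap over your dataset.
    return np.abs(heatmap - average_heatmap)
```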


Do you want to further discuss this idea?

Book a meeting with Paulo Maia


Conclusion

Spatial explanations offer a powerful approach to interpreting image models and extracting valuable insights from visual data. By employing spatial occlusions, businesses can unravel the inner workings of their models, enabling informed decision-making and enhanced trust. Embrace the power of spatial explanations to unlock the full potential of your image models. Contact us if you’re ready to embark on a journey of comprehensive understanding and actionable insights.

