Technical guide

Trusting your machine learning model: the Confusion Matrix

 

How do you trust your machine learning model? How will you know whether its predictions will be adequate for your engineering applications? How can you fine-tune your model to reach the success criteria best suited to you?

Access the resource to learn more.


Resource summary

The confusion matrix is a data science tool used to compare a model's predictions against the actual outcomes, showing where the model was right and where it was wrong. Essentially, it helps determine whether a model meets your needs.
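As a rough illustration (not taken from the resource itself), the sketch below shows how the four cells of a binary confusion matrix can be tallied by comparing predictions against actual outcomes. The helper name confusion_counts and the example labels are purely illustrative.

```python
# Minimal sketch: tally the four cells of a 2x2 confusion matrix by
# comparing each prediction against the actual outcome.

def confusion_counts(actual, predicted, positive_label=1):
    """Count true/false positives and negatives for a binary classifier."""
    tp = fp = fn = tn = 0
    for a, p in zip(actual, predicted):
        if p == positive_label:
            if a == positive_label:
                tp += 1  # predicted positive, actually positive
            else:
                fp += 1  # predicted positive, actually negative
        else:
            if a == positive_label:
                fn += 1  # predicted negative, actually positive
            else:
                tn += 1  # predicted negative, actually negative
    return tp, fp, fn, tn


if __name__ == "__main__":
    actual    = [1, 0, 1, 1, 0, 0, 1, 0]  # ground-truth outcomes
    predicted = [1, 0, 0, 1, 0, 1, 1, 0]  # model predictions
    tp, fp, fn, tn = confusion_counts(actual, predicted)
    print(f"TP={tp}  FP={fp}  FN={fn}  TN={tn}")
```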

Understanding the confusion matrix can sometimes be tricky, so we've developed a resource to make things clearer. This includes: 

  • An overview of the confusion matrix 
  • Definitions of key terms such as 'true positive' and 'false negative' 
  • A guide on how to interpret the confusion matrix and choose the best model for your scenario 
  • A worked example of what this could look like for an engineering use case 
  • A glossary of key terms and formulae relevant to the confusion matrix (a brief sketch of the standard formulae follows this list) 
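To give a flavour of the kind of formulae such a glossary covers, here is a short sketch of the standard metric definitions derived from the four confusion-matrix cells. These are the textbook definitions; the function name metrics and the example counts are illustrative assumptions, not taken from the resource.

```python
# Standard metrics derived from the four confusion-matrix cells
# (textbook definitions, with zero-division guards).

def metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    total = tp + fp + fn + tn
    accuracy  = (tp + tn) / total if total else 0.0
    precision = tp / (tp + fp) if (tp + fp) else 0.0  # flagged positives that were real
    recall    = tp / (tp + fn) if (tp + fn) else 0.0  # real positives that were caught
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}


print(metrics(tp=3, fp=1, fn=1, tn=3))
# {'accuracy': 0.75, 'precision': 0.75, 'recall': 0.75, 'f1': 0.75}
```

Which of these metrics matters most depends on your use case; the resource's interpretation guide walks through how to weigh them when choosing a model.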

We're committed to demystifying complex machine learning concepts for engineers through resources like this one. Stay tuned for further resources on all things machine learning, engineering, and battery testing.