Some ML models, such as decision trees, can explain their decisions in a comprehensible way: a single tree's prediction boils down to a list of if/else conditions on the features. Unfortunately, many models can't do this (random forests, for example). Whether an AI can explain every decision it makes comes down to the underlying technique. A random forest, for instance, trains many decision trees, each on a random sample of the data and considering random subsets of the features, and then aggregates their votes. With hundreds of trees combined, there is no single rule list to show you, so any "explanation" wouldn't be of much use.
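A minimal sketch of the contrast (assuming scikit-learn is available; the dataset and parameters here are just illustrative): a single decision tree can be dumped as readable if/else rules, while a random forest is an ensemble of many such trees with no single rule list behind it.

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text
from sklearn.ensemble import RandomForestClassifier

data = load_iris()
X, y = data.data, data.target

# A single tree: its decision logic prints as nested if/else conditions.
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=list(data.feature_names)))

# A random forest: many trees, each trained on a bootstrap sample with
# random feature subsets considered at each split. There is no single
# rule list to print, only a collection of separate trees.
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(len(forest.estimators_))  # 100 independent trees
```

The first `print` shows the kind of human-readable explanation a single tree offers; the second makes the point that a forest's "explanation" would be 100 such rule lists combined by voting, which is why it is not comprehensible in the same way.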