Not necessarily. We are fast moving into an era where humans and AI agents will work together as teams, hence the huge interest in human-aware AI. This is why explanations matter: the human agents need to understand the results their AI teammates produce. If a human works with an AI agent that is 80% accurate but provides explanations, they can probably use the algorithm's results more effectively, finish the task, and perhaps push the overall accuracy up to 95%. With the 90% accurate but unexplained model, the human is left clueless and the macro-task may remain unaccomplished.

For human-robot collaboration to be truly synergistic, explanation will be necessary.