Metaphors are useful for avoiding a complete understanding of something: they let you settle for a simple abstract idea instead. Feynman refuses to give examples of what physical phenomena are "like" because there is nothing familiar that an electron around a nucleus resembles closely enough to give a better understanding of it. It is not like planets around the sun; the forces involved are completely different!
Metaphors have their place. In a literary sense they create a connection with your audience and convey feeling and empathy. In a technical setting a metaphor can be used effectively to highlight fundamental properties of a model. They provide an excellent communication device. But beware of their use if you are searching for a precise understanding.
Peter Norvig presents an excellent dive into models and theories from a computer scientist's perspective. The key theme, to me, is that rich data produces better models than clever algorithms. There is a large class of problems that can be solved very well by statistical models that are, by virtue of their sheer size, not intuitively understandable. Computers are allowing us to build models in ways that are beyond metaphor.
At Outpace we use simple machine learning over large datasets to personalize content. It makes our customers lots of money. We routinely see a 20-60% lift in conversions, compared to business as usual. The value of data is very obvious in such a setting. We literally convert data into money.
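To make "lift in conversions" concrete, here is a minimal sketch of how such a lift is typically measured in an A/B test, where the control group sees business as usual and the treatment group sees the personalized content. The function names and the numbers are hypothetical, chosen purely for illustration; they are not Outpace's actual system or figures.

```python
# Toy sketch (hypothetical, for illustration only): measuring relative
# conversion lift of a personalized experience over a control.

def conversion_rate(conversions, visitors):
    """Fraction of visitors who converted."""
    return conversions / visitors

def lift(treatment_rate, control_rate):
    """Relative improvement of treatment over control."""
    return (treatment_rate - control_rate) / control_rate

# Hypothetical A/B test counts.
control = conversion_rate(conversions=500, visitors=10_000)    # 5.0%
treatment = conversion_rate(conversions=650, visitors=10_000)  # 6.5%

print(f"lift: {lift(treatment, control):.0%}")  # prints "lift: 30%"
```

A 6.5% conversion rate against a 5.0% baseline is a 30% relative lift, which sits inside the 20-60% range mentioned above.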
In science, access to data leads to amazing progress. Examples such as the Human Genome Project illustrate the huge leap modeling can take when backed by rich data.
For these reasons I am suspicious of explanations by metaphor. They rarely match the fidelity or utility of an explanation derived from data. The best explanations present a model, or a description of one, grounded in data.