I can't remember when I was last so frustrated by a book that could have been so brilliant... but then manages to cut out 95 per cent of its potential audience. Matt Cook's book promises to deliver '75 ingenious paradoxes in mathematics, physics and philosophy'. And it does. Some are familiar, from Russell's paradox to the Monty Hall problem, but quite a few were new to me. I absolutely loved reading about the paradoxes. But. There's a big but.
The problem is that Cook does two things that make the book unreadable to many. One is to forget Richard Feynman's assertion that there's no point in just learning labels for things. (Ironic, as Cook frequently cites Feynman, and even has a dedication that includes 'To Richard Feynman, who saved my father's life'.) Yet Cook insists on giving us all the technical terminology and what it means - which is totally unnecessary to explain the paradoxes. Who cares that something is called a bijection? We don't need to know.
Secondly, and even more significantly, rather than explaining the paradoxes using words and simple illustrations, he uses mathematical and logical notation. He tells us there is a notation guide at the back, but this totally misses the point. No one who doesn't already know this stuff is going to bother - they will just be turned off. To be honest, this approach is lazy. It's perfectly possible to explain the paradoxes without resorting to technical notation - but if you're already experienced in the field it's easier to use symbols and expressions rather than words. Even for the paradoxes I know well, understand and have explained to others, I found it pretty much impossible to follow Cook's approach - it just made everything far too complicated.
I was also a little worried about the frequency with which Cook quotes Ayn Rand - it felt a bit like a modern psychology book quoting Freud - but that's perhaps just a personal or UK versus US preference.
Overall, then, I was so frustrated. There is a brilliant book in here, trying to get out - but it is so confined by jargon and notation that I suspect it will only appeal to those who are already well-versed in the relevant methods and symbols. Such a shame.