This is really interesting to me because it is one of the problems I often find floating around in my head that has no simple answer. I know, there are very few simple answers to anything, but this one specifically has two simple paths, both of them have plenty of cases justifying their positions, and neither is inherently wrong at first glance.
Supporters of expertise will notice the extremely obvious fact that the world is populated by a very large excess of people who are stupid; it is thus important that we be capable of gauging a given speaker's capacity to speak about the subject matter they're currently addressing. Case in point: when the new guy decries the inefficiencies of Subversion by saying something along the lines of "This is really dumb", most listeners lend less credence to his argument than to the video lecture by Linus Torvalds on why Subversion was, in fact, quite stupid.
At this point the innate reaction is to say something along the lines of: "Yeah, but despite what he said, Subversion does actually get the job done a large amount of the time, and if you're working in an environment with a bunch of Windows-based coders who are intimidated by the command line, setting up a version control system based on git is actually just as stupid as Linus' points about git being better than Subversion make Subversion appear to be. TortoiseSVN is the clincher." Telling something like that to a person like Linus would just make him laugh at you. And yet you'd be pretty much right on the money with that summation of the situation: although a genuine expert would indeed laugh at you, git would still be a bad fit, for the exact reasons you raise.
Within our domain, it is important to be able to research and critically analyse complex situations independently, and to come to something of a balanced and well-thought-out conclusion regarding issues of such complexity that if you were to try to explain them to someone from before the dawn of civilisation, you may as well be talking to an alien. We all exist and operate within this space as subject matter experts to lesser or greater degrees, based on our ability to bootstrap our grasp of a problem from the entire expanse of human knowledge.
I was reading a blog post by Steve Yegge not long ago, which I thought was a really interesting summation of the entire situation. He was talking about the level at which you could safely rely upon an innately leaky abstraction as just magic. Amusingly enough, he placed this level just below his own: he understood and admitted that he didn't really get how stuff worked at the transistor level, but if anyone wanted to argue with him about the importance of knowing raw Java rather than just using J2EE, or of recollecting more patterns from Design Patterns than the Singleton, they'd be in for a fight to the death. Despite Steve's dismissal of comprehension below his chosen level of abstraction, that is actually critical stuff to know under certain circumstances.
I often hear nowadays that the entire length and breadth of human knowledge is simply too vast to store within memory, and that you cannot become a subject matter expert on every single thing humans have discovered and abstracted in the history of civilisation. This is self-evidently true, and yet in any given sphere that depth of knowledge is entirely critical to the sphere in question. The solution, in my opinion, is to abandon our vaunted treatment of field expertise as rote memorisation, rapid calculation, or precise simulation, even at the generalised theory level. All three of these things computers do far better than any of us, and they should be used when these things need to be done.
I had a job interview with Google, and have read of many job interviews conducted by Google, where they've almost disqualified candidates on the spot for writing a prototype C program in the interview and not spotting a memory leak immediately, or because they could not instantly recall the precise number of blocks in an inode created by mke2fs on distribution X. Stuff like this is a symptom of the disease that this entire situation is so emblematic of. We have words for people who dedicate a disproportionate amount of mental resources to rote memorisation, or to the forest-for-the-trees reasoning that is a hallmark of the aforementioned situations: idiot savant, autistic, and so on.
It's particularly amusing behaviour coming from the very company that makes such skills largely irrelevant. Mark Cuban summed it up pretty well when he said that an excellent memory used to be worth something, but now we just google it. If you rely on your encyclopaedic knowledge of domain X without reference-checking your critical decisions each and every time, and without making sure your underlying assumptions are entirely valid, sooner or later you will make a mistake that someone who does those things would not have made. And no, the memorisation doesn't make you immensely faster or more effective, because the cognitive abilities you sacrifice to that rote exercise tend to impair your ability to quickly and effectively conduct a complete analysis of the problem on the spot, building up all the information from nothing and making sure it is valid and applicable at the exact time you're doing it. Idiot savants, absent-minded professors, and general autistic tendencies are illustrative of exactly what I'm talking about.
We, especially as coders, but arguably as an entire species, are ideally no longer purely biological entities when it comes to addressing problems. We have a wealth of experience to draw from, both personal and external, regarding what has worked in similar domains in the past. We do not need to remember every keyword and function call in an entire language by rote to be effective coders; we do not need to remember every object-oriented design paradigm to be effective coders; and we absolutely, positively do not need 100% reliable working C compilers embedded in our wetware, nor do we need to memorise the precise number of blocks in an inode created by mke2fs on distribution X version Y. But it very much helps toward the goal of being effective coders if we can gain access to all of that prior information, and an indefinite amount more, as quickly and easily as possible.
And this, I believe, is the true role of expertise: understanding what the important variables are, and being able to quickly and reliably fill them in with due regard for the specific domain of the problem in question. I will take someone who can do that over a person who has memorised less than a percent of what could reasonably be stored in a terabyte of space, on any modern software project. Compete in the sphere in which we excel: none of us will ever outmatch a hard disk in a memory contest, nor execute more instructions per second than a modern CPU.
Disclaimer: This is only my opinion, and I fully admit that I could be wrong; maybe these things are in fact critical, and I am in fact simply stupid, and the world will keep turning without my ludicrous opinions, thank you very much. As for the specific examples I gave, I have a ton of respect for both Steve Yegge and Google, despite disagreeing with them on this particular issue. It is not my intent to point and laugh, merely to illustrate that some overall very clever people and organisations are behaving in some small ways which, under closer examination, are maybe not all that clever. The fact that Steve has started pushing the virtues of dynamic languages, and that Google doesn't insist everything be done in assembler, gives me hope for the future.