Rising to the bait again (to keep the fishy metaphor going), there's yet another discussion of how the ability to hack the runtime changes the world. At one level that's true, but not in this case.
Dependency Injection is not a recent invention that a cabal of TDDers forced on the world, and it's absolutely not something that we came up with just so we could crowbar in our Mock Objects.
The relevant design guideline, named the Law of Demeter ("Strong Suggestion" isn't very catchy), was first described by Karl Lieberherr in 1987! The Mock Object pattern came from people applying their substantial O-O experience to TDD in Java, trying to figure out how best to avoid exposing implementation details in unit tests. One of the critical lessons from Demeter is that objects should have explicit dependencies. It helps to keep them focussed on their responsibilities and, as a result, easier to maintain, and a good way to make dependencies explicit is to pass them in.
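To make the style concrete, here's a minimal sketch of what "pass them in" looks like. The names (`Ledger`, `AuditTrail`) are mine for illustration, not from any particular codebase: the collaborator arrives through the constructor, so the dependency is visible in the signature rather than hidden inside a lookup or a `new`.

```java
import java.util.ArrayList;
import java.util.List;

interface AuditTrail {
    void record(String entry);
}

// The Ledger's collaborator is passed in through the constructor, so the
// dependency is explicit in its signature rather than wired up internally.
class Ledger {
    private final AuditTrail audit;

    Ledger(AuditTrail audit) {
        this.audit = audit;
    }

    void post(String account, int amount) {
        audit.record("posted " + amount + " to " + account);
    }
}

// Because the seam exists, a unit test can substitute a simple hand-rolled fake.
class RecordingAuditTrail implements AuditTrail {
    final List<String> entries = new ArrayList<>();
    public void record(String entry) { entries.add(entry); }
}
```

No framework, no XML: the "injection" is just an argument to a constructor.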
Unfortunately, since then the world seems to have filled up with DI frameworks which cloud what should be a coding style discussion. This is not about having to configure every last corner of your application in XML; this is about how objects, or small clusters of them, get to talk to each other.
Roy asks, "[...] do you need DI all over the place, or just in specific places where you know dependencies could be a problem?" Well, no: only if you have enough foresight to know where those places are. I'm struggling at the moment to test against a framework where everything is helpfully packaged up nice and tight, so I can't create an instance of one of its core objects. It's actually well written, but the authors just weren't good enough at prophecy to figure out my particular need. That's why I don't rely just on my intuition; I use the needs of my unit tests to help me figure out where the seams should be. To counter the FUD argument, I have absolutely no problem with saying that I don't want tools that do magic, because I need guidance with my code.
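Here's a sketch of what "letting the tests find the seam" means in practice, with hypothetical names of my own. The hard-wired version grabs the current time directly, so no test can control it; the moment you try to write the test, the need for a controllable clock tells you exactly where the seam belongs.

```java
import java.time.Instant;

interface Clock {
    Instant now();
}

// Hard to test: the collaborator is hard-wired inside the method, so a unit
// test has no way to control what time the receipt is stamped with.
class StampedReceiptHardWired {
    String stamp(String text) {
        return text + " @ " + Instant.now();
    }
}

// The test's need for a fixed time reveals the seam: pass the clock in.
class StampedReceipt {
    private final Clock clock;

    StampedReceipt(Clock clock) {
        this.clock = clock;
    }

    String stamp(String text) {
        return text + " @ " + clock.now();
    }
}
```

A test can then supply a fixed clock (the single-method interface even admits a lambda) and assert on an exact, repeatable result.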
As Roy (very politely) concludes, there isn't a high enough proportion of really good code in the world (some of it mine) that we should be in a hurry to cut back on techniques such as DI. Just because something has been around for a while doesn't mean it's been superseded, especially in as conventional an environment as .Net.

