Well, the definitions I found did not talk about "automated" being part of the description.
If that's the case, "unit testing" is just testing with an extra adjective to make the speaker/writer sound more intelligent... savvy... up-to-date... a thrall to style. Something management will nod their heads at in agreement, not wishing to appear ill-informed.
Just like the moronic overuse of the word "issue" when a real word with an actual commitment to intentions is readily available.
Examples that trivialize the problem, or that show how testing might fail, really don't prove any point. Perfection may not be achieved, but at least one makes a true 'best effort' or does 'due diligence' (pick the one you prefer).
I prefer finding problems before the users do. Is it 100% effective? Not quite! But reliability has engendered trust in my "product". Embracing accountability, should I miss one, takes care of what I miss.
This place where I work does between $400 and $500 million/yr in gross business - a huge amount of it passes through my application framework. How much is a "tiny" error worth to them vs. my time?
Sander, I agree.
I do some unit testing - for classes and their methods that are easily testable, either because they are generic, have no references to anything, or do simple "toolbox" stuff: string helpers, unit converters, those kinds of things.
It's fast, and in many cases it's easier to write a quick test than to set up some console output in the main() of the program just to call that stuff and see what it does.
But I don't force anybody to run those tests with every compile, hence many of the tests never run again... They are a replacement for Console.WriteLine in the main program.
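For reference, the kind of quick "toolbox" test described above might look like this in Python (a minimal sketch; the helper functions are hypothetical examples, not from any real codebase):

```python
# Minimal sketch of "toolbox" unit tests: a string helper and a unit
# converter. Both helpers are hypothetical illustrations.

def snake_to_camel(s: str) -> str:
    """Convert 'snake_case' to 'camelCase'."""
    head, *rest = s.split("_")
    return head + "".join(w.capitalize() for w in rest)

def celsius_to_fahrenheit(c: float) -> float:
    """Convert degrees Celsius to degrees Fahrenheit."""
    return c * 9.0 / 5.0 + 32.0

def test_helpers() -> None:
    assert snake_to_camel("unit_test_demo") == "unitTestDemo"
    assert snake_to_camel("single") == "single"
    assert celsius_to_fahrenheit(0.0) == 32.0
    assert celsius_to_fahrenheit(100.0) == 212.0

if __name__ == "__main__":
    test_helpers()
    print("all helper tests passed")
```

Run directly, it does the same job as a throwaway Console.WriteLine in main(): the assertions either pass silently or fail loudly.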
I am also not a big fan of that religious MVVM drama...
When we need a new value from the UI passed down to some network layer, because we need to deliver it to the server, and I hear an answer of "oh wow... that's hard, takes 3 or 4 days, we need several new commands here and bindings there and don't-know-what over there, and it will affect 50+ unit tests..."
OMG... go away! Just send that f*cking integer value over the line. If such a simple change causes DAYS of work, then this is not a programming model I want to work with. Period.
I have never used any mocking framework or dependency injection... thing... Man! Wake up! It's just the next bunch of frameworks and libraries and biiiig big stuff you never saw, which will introduce new bugs, has its own flaws... creates more trouble than it solves... You are totally giving up control of what happens in your OWN program!
That blind religious behavior is something I really see as critical. Those people trust some totally unknown developer and some library more than their own ability to write stable, working code.
And then that wonderful day comes where you see tears in the eyes of some developer: "I don't know why it is not there! I have all the bindings, the DI is set up, and all unit tests are green. I know my mocks are right in the tests. It MUST work!"...
Then I get myself some coffee, maybe a coke, and don't forget the popcorn! Lean back and watch them cry. Then that fine moment comes where they realise they have totally given up control just to follow some strange... "model"... because "it's state of the art to do things like that"... oh man...
I have apps in the store (professional, under my own label), I write games, I have never used such alien stuff, and all my software works fine. Sometimes I think many of those hyped models are a dubious attempt to make development easier by having people stop thinking, because the unit tests will find it anyway. Don't think about object hierarchies, stop doing good object design, just push 15 interfaces into a constructor and write 20 factories. And at some place, somewhere, somebody will finally create a living instance of IAnything and push it to ISomething, where IDontKnowWhat will do ISee...
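For readers unfamiliar with the pattern being lampooned, here is a deliberately minimal Python sketch of constructor injection with a hand-rolled mock (all interface and class names are hypothetical illustrations, not from any real framework):

```python
# Minimal sketch of constructor injection: the dependency arrives through
# the constructor instead of being created inside the class.
from abc import ABC, abstractmethod

class ISender(ABC):
    @abstractmethod
    def send(self, value: int) -> None: ...

class NetworkSender(ISender):
    def send(self, value: int) -> None:
        print(f"sending {value} over the line")  # stand-in for real network code

class RecordingSender(ISender):
    """A hand-rolled mock: records values instead of sending them."""
    def __init__(self) -> None:
        self.sent: list[int] = []
    def send(self, value: int) -> None:
        self.sent.append(value)

class ValueService:
    def __init__(self, sender: ISender) -> None:
        self._sender = sender  # injected, not constructed here
    def push_value(self, value: int) -> None:
        self._sender.send(value)

# Production wiring vs. test wiring:
ValueService(NetworkSender()).push_value(42)
mock = RecordingSender()
ValueService(mock).push_value(42)
assert mock.sent == [42]
```

Whether the extra indirection is worth it for "just sending an integer over the line" is exactly the disagreement in this thread.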
I write (generic) platform modules, and in most cases I have MORE reusable code than the MVVM extremists, but it feels like being Don Quixote sometimes...
not my world. will never be.
When we write new software I write unit tests, at least for the essential parts, and try to convince the team to do the same.
(And when I'm with the estimation team I add unit tests to the items to be calculated.)
But when we add something to old, grown software there's no way to get weeks of additional budget for unit tests.
And I don't see much sense in writing unit tests covering 0.1% of the code.
It's the process of debugging your application by means of getting the compiler to produce an EXE file.
Once it stops issuing ERROR messages (WARNINGS are fine, you can ignore them) you're ready to ship!
"Unit" testing is just an Acronym for this process: "yoU Need It Tuesday"
Yes, I remember during the good old FORTRAN and C days, they used to say that each subroutine or function was to be tested by giving it several valid inputs and several more invalid inputs, and then observing the response.
For example, for a numerical argument: give input within range and at the boundary (for valid inputs), and out-of-range values, values one off the boundary, alphanumeric strings, decimals where integers are expected, negative numbers, etc. (for invalid inputs).
Similarly for strings.
Faintly remember that it used to be called Equivalence Class Testing, or equivalence partitioning. Quite robust. Not sure whether it is still in use.
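That technique, usually called equivalence partitioning combined with boundary-value analysis, is indeed still in routine use. A minimal Python sketch, using a hypothetical validator for integer percentages in the range 0 to 100:

```python
# Equivalence-partitioning / boundary-value sketch for a hypothetical
# validator that accepts integers in the range 0..100.

def is_valid_percentage(value) -> bool:
    # Exclude bool explicitly: in Python, bool is a subclass of int.
    return (isinstance(value, int)
            and not isinstance(value, bool)
            and 0 <= value <= 100)

# One representative per equivalence class, plus the boundary values:
valid_inputs   = [0, 1, 50, 99, 100]          # in range and at both edges
invalid_inputs = [-1, 101, 3.5, "42", None]   # below, above, wrong types

for v in valid_inputs:
    assert is_valid_percentage(v), v
for v in invalid_inputs:
    assert not is_valid_percentage(v), v
print("all equivalence classes behave as expected")
```

The point of the technique is that one representative per class (plus the boundaries and the values one past them) covers the input space without exhaustively testing every value.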