What Ruby's zero? method can teach you about object-oriented programming
Did you know that Ruby has a method called zero? Check out the Ruby documentation and search for it… you’ll find several results.
Why does Ruby have this method? I’m not 100% sure, to tell you the truth… I guess it’s just useful to check for zero sometimes :) But we can learn a lot from it!
First I want to show you a common trick when doing a comparison. People with a C/C++/Java background tend to use it… or just people who have been coding a while :)
Let’s say you want to check whether a variable equals zero. You’d write something like:
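For instance, with the count_people variable from this example:

```ruby
count_people = 10

# The check you meant to write: a double "==" compares the values.
if count_people == 0
  puts "Nobody is here"
end
```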
but it becomes REALLY easy to make a little typo:
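Here’s the same check with the typo:

```ruby
count_people = 10

# Typo: a single "=" ASSIGNS 0 to count_people instead of comparing.
# And since 0 is truthy in Ruby, this branch runs and prints the wrong message.
if count_people = 0
  puts "Nobody is here"
end
```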
Do you see the typo? I’ve accidentally assigned 0 to the variable count_people, instead of comparing it. The trick is to flip the comparison:
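Like this, with the constant on the left:

```ruby
count_people = 10

# Flipped ("Yoda") comparison: the literal 0 goes first.
if 0 == count_people
  puts "Nobody is here"
end
```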
because now if you accidentally use an assignment instead of a comparison, you get an error:
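I’m wrapping the bad line in eval here just so the snippet still parses as a file… typing the typo directly gives you the same SyntaxError:

```ruby
count_people = 10

begin
  # The same typo, flipped: you can't assign to the literal 0,
  # so Ruby refuses to even parse this code.
  eval("if 0 = count_people; end")
rescue SyntaxError => e
  puts "Ruby caught the typo: #{e.class}"
end
```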
The key here is that when I make a mistake, Ruby raises an error and lets me know. Without that error message, I’d have introduced a bug into my code. Yikes!
zero? uses a similar principle – Ruby will raise an error for any object that doesn’t respond to it:
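Here’s a quick sketch of that behavior (rescuing the errors so every line runs):

```ruby
puts 0.zero?    # true
puts 0.0.zero?  # true
puts 7.zero?    # false

# String doesn't implement zero?, so this raises NoMethodError.
begin
  "abc".zero?
rescue NoMethodError => e
  puts e.message
end

# Neither does nil.
begin
  nil.zero?
rescue NoMethodError => e
  puts e.message
end
```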
If I did a simple comparison, each line would just return true or false… which in the case of a string or nil is probably the wrong answer. There’s a chance the incorrect value raises an error later, but by then I might have already updated a database record or sent an email. Much better to surface the problem and raise an error sooner.
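To make that concrete, here’s what the plain comparisons return:

```ruby
puts("abc" == 0)  # false
puts(nil == 0)    # false

# Both quietly return false, so a check like `value == 0` just takes
# a branch instead of telling me something is off with the value.
```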
zero? encapsulates a tiny bit of logic – the comparison to 0 that you’d expect. By doing so, it adds intentionality to your code. It’s not about readability. It’s about expressing the domain in code, and letting the Ruby interpreter give you helpful feedback.
Where do you go from here?
Well, it’s pretty simple… write methods that express your intent. When you do, you get a safety check from the Ruby interpreter, and you gain flexibility through polymorphism.
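For example, here’s a hypothetical Temperature value object (my own made-up class, not part of Ruby) that joins the zero? protocol, so it can be asked the same question as Integer or Float:

```ruby
class Temperature
  attr_reader :degrees

  def initialize(degrees)
    @degrees = degrees
  end

  # Same idea as Integer#zero? and Float#zero?: a tiny bit of
  # encapsulated logic that expresses intent.
  def zero?
    degrees == 0
  end
end

puts Temperature.new(0).zero?   # true
puts Temperature.new(21).zero?  # false
```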