Wednesday, December 17, 2014

Refactoring running rampant

The purpose of refactoring is to increase the quality of code without changing its functionality. Unfortunately, we too often forget that "quality" is in the eye of the beholder.

Let me give you one example. These are pseudo-Java code snippets, but you can see where this leads.


Originally, there were 2 classes which contained code like this:

class A
{
    int X;
    int Y;

    public int compute()
    {
        return this.X + this.Y;
    }
}

class B
{
    int E;
    int F;

    public int compute()
    {
        return this.E - this.F;
    }
}

Well, those are highly similar methods, so "obviously", this calls for refactoring.

Let's start refactoring to a common level:



class A
{
    int X;
    int Y;

    private int _compute()
    {
        return this.X + this.Y;
    }

    public int compute()
    {
        return this._compute();
    }
}

class B
{
    int E;
    int F;

    private int _compute()
    {
        return this.E - this.F;
    }

    public int compute()
    {
        return this._compute();
    }
}


We have some duplicate code now, so we want to eliminate it by moving the method into a new class:


abstract class operatorClass
{
    public int compute()
    {
        return _compute();
    }
    protected abstract int _compute();
}

class A extends operatorClass
{
    int X;
    int Y;
    protected int _compute()
    {
        return this.X + this.Y;
    }
}

class B extends operatorClass
{
    int E;
    int F;
    protected int _compute()
    {
        return this.E - this.F;
    }
}

At least now there is a common level, but the code is still highly similar. Let's do something about it.



class A extends operatorClass
{
    A()
    {
        setOperator("+");
    }
}

class B extends operatorClass
{
    B()
    {
        setOperator("-");
    }
}

abstract class operatorClass
{
    int X;
    int Y;
    String operator;

    public int compute()
    {
        switch (operator)
        {
            case "-": return X - Y;
            case "+": return X + Y;
            default: throw new InvalidOperatorException(operator);
        }
    }

    protected void setOperator(String operator)
    {
        this.operator = operator;
    }
}


Yay! We eliminated the "nearly duplicate" method in 2 different classes and reduced the amount of code in both of them.
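Just to show that the observable behaviour really did stay the same along the way, here is a minimal, hypothetical usage sketch against the final version (the Demo class and the concrete values are my own illustration, not part of the original code):

class Demo
{
    public static void main(String[] args)
    {
        A a = new A();
        a.X = 2;   // the fields now live in operatorClass
        a.Y = 3;

        B b = new B();
        b.X = 7;   // B's former E and F are now the inherited X and Y
        b.Y = 4;

        System.out.println(a.compute());   // 5, same result as the original class A
        System.out.println(b.compute());   // 3, same result as the original class B
    }
}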

Unfortunately, there is a small fly in the ointment here:

  • Case statements are poor code. Why? They are harder to unit-test (path coverage comes to mind), and they violate the Single Responsibility Principle.
  • The "operatorClass" is now doing stuff it shouldn't be doing, that is: making a decision that should be made on a more abstract level, i.e. on the level when the object is created - a violation of the Dependency Inversion Principle!
  • We actually introduced the possibility of error into the operatorClass's "compute" method. Calling "compute" with an invalid operator was not even possible before!
  • Oh, and that, of course, means we need additional exception handling. We didn't even go into the new "InvalidOperatorException" class that we must create.
  • Each time we implement a new class that needs its own "compute" behaviour, we must also modify the "operatorClass", so we just violated the Open/Closed Principle (see the sketch after this list)!
  • Not to mention that the application's performance has just deteriorated: it is now slower, because the "case" statement must be evaluated, and it consumes more memory, because an additional variable ("operator") must be initialized.
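To make the last two bullets concrete, here is a hedged sketch. The body of "InvalidOperatorException", the new class C and its "*" operator are my own illustrative assumptions, not part of the original example:

// The exception class we are now forced to write, just to report an error
// that could not even occur before this refactoring step. It extends
// RuntimeException so that compute() compiles without a throws clause.
class InvalidOperatorException extends RuntimeException
{
    InvalidOperatorException(String operator)
    {
        super("Unknown operator: " + operator);
    }
}

// A hypothetical new requirement: a class C that multiplies.
// With the switch-based design, adding C alone is not enough ...
class C extends operatorClass
{
    C()
    {
        setOperator("*");
    }
}

// ... because operatorClass.compute() must also be edited to add
//
//     case "*": return X * Y;
//
// Every new operation reopens operatorClass - that is the Open/Closed
// violation. In the earlier abstract-method version, C would simply have
// been one more subclass with its own _compute(), and no existing class
// would have to change.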
Summary:
While the code looks cleaner when you only look at the level of A and B, we merely "shoved dirt under the rug", but we didn't help at all!

Lesson learned

Refactoring is not an end in itself.
Not every refactoring is actually a positive change.
When refactoring, you must set a clear purpose for what you want to accomplish - and why. Even when you do not break any (unit) tests and the code becomes shorter, you may be doing something tremendously harmful.
I strongly advise doing Code Katas occasionally to get a grip on how to refactor beneficially.
