Architecting and implementing software is a constant balancing act. The design has to work in the end, but the underlying implementation can vary greatly, even during development.
Do you re-use existing classes that almost fit, or create a new or derived hierarchy for just what is needed? Do you abstract the common needs into base and helper/utility classes, while keeping them simple enough to modify for ever-changing requirements? Is the abstraction clear to others who will maintain the code months or years later? Should the abstraction exist only for the main feature/operational aspects of the application, or should it extend to the smaller aspects of the operations as well?
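The "almost fits" question above can be sketched concretely. This is a hypothetical example (the `Report` class and its methods are invented for illustration) contrasting the two options: a small derived class for just what is needed, versus a plain helper that avoids growing the hierarchy at all.

```python
# Hypothetical: an existing Report class almost fits a new CSV-export need.

class Report:
    """Existing class: renders rows as a plain-text table."""
    def __init__(self, rows):
        self.rows = rows

    def render(self):
        return "\n".join(" | ".join(map(str, r)) for r in self.rows)

# Option 1: a small derived class, re-using the hierarchy.
class CsvReport(Report):
    def render(self):
        return "\n".join(",".join(map(str, r)) for r in self.rows)

# Option 2: a standalone helper, adding no new hierarchy at all.
def render_csv(rows):
    return "\n".join(",".join(map(str, r)) for r in rows)

print(CsvReport([[1, 2], [3, 4]]).render())  # 1,2  /  3,4
print(render_csv([[1, 2], [3, 4]]))          # same output, no subclass
```

Neither option is always right: the subclass pays off if CSV reports will grow more behavior; the helper is cheaper when this is a one-off.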
It's a constant battle of decisions during implementation.
The abstraction question is not isolated to development: the cost associated with these decisions can increase the size of the code and the timeline of the implementation, and even place a project at risk.
During sprints for adding new features there is a temptation to refactor classes to make better use of abstraction, code re-use, and testing, but this comes with a cost. Too much refactoring can destabilize the code (yes, I know the tests should catch errors; theory vs. practice), lengthen deadlines, and slow the finishing of the project as more features are completed. There is a balancing act to the refactor/abstraction dance.
Over the years, I've found my balance point of when to abstract and when not to. My goal is to have a fair amount of abstraction, but not down at the level where the logic hits the metal (like UI behavior). Basically, I don't abstract when it feels forced or when complexity starts to build because of the abstraction. The larger the abstraction granularity (base classes, MVC, the command pattern, etc.), the lower the complexity. The finer the abstraction, the greater the risk of increasing complexity and of code that is harder to maintain, refactor, and so on. You can sometimes see this when there are abstractions upon abstractions; that's a red flag of complexity.
It's a tough balancing act to follow.