Five Lines of Code

How and when to refactor
by Christian Clausen, 2021, 336 pages

Key Takeaways

1. Refactoring improves code quality without altering its external behavior.

Refactoring—Changing code to make it more human-readable and maintainable without changing what it does.

Core definition. Refactoring is the disciplined process of improving code's internal structure and clarity without altering its observable external behavior. This distinction is crucial: the goal isn't to add new features or fix bugs, but to make the existing functionality easier to understand and modify. This focus ensures safety, as the system continues to perform exactly as before.

Economic benefits. The primary drivers for refactoring are economic. Programmers spend more time reading and understanding code than writing it. By enhancing readability, refactoring frees up valuable developer time, allowing for faster feature implementation. Improved maintainability also leads to fewer, easier-to-fix bugs, reducing long-term costs and making the codebase more enjoyable to work with.

Pillars of improvement. Effective refactoring rests on three pillars: communicating intent to boost readability, localizing invariants to improve maintainability, and achieving these without affecting any code outside the immediate scope. Localizing invariants means grouping related data and functionality, making it harder to inadvertently break assumptions about the data's state.

2. Adopt strict, actionable rules to guide your refactoring efforts.

A method should not contain more than five lines, excluding { and }.

Beyond code smells. Traditional refactoring often relies on abstract "code smells," which can be difficult for less experienced developers to identify and address. This book advocates for concrete, easy-to-apply rules, like the "Five lines" rule, which states that no method should exceed five lines of code. This strict limit provides a clear, objective target for improvement.

Simplicity and clarity. These rules act as "training wheels," offering a straightforward path to better code quality. While sometimes overly strict, they consistently push towards smaller, more focused methods. Each method's name becomes an opportunity to communicate intent, effectively replacing comments and making the code self-documenting.

Structured approach. The rules are paired with explicit, step-by-step refactoring patterns, ensuring that transformations are performed safely and reliably. This structured approach minimizes the risk of introducing errors, especially when relying on the compiler to catch mistakes, rather than extensive automated tests.
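
To make the rule concrete, here is a minimal sketch of the idea (the board-drawing names are illustrative assumptions, not the book's own example): a method that mixes concerns is split until every method fits within five lines, and the new method names carry the intent.

```typescript
type Tile = " " | "X" | "O";
type Board = Tile[][];

// Each method stays within five lines (braces excluded); the extracted
// names replace the comments that would otherwise separate the sections.
function drawBoard(board: Board): void {
  printHeader(board);
  printRows(board);
}

function printHeader(board: Board): void {
  console.log(`Board ${board.length}x${board[0].length}`);
}

function printRows(board: Board): void {
  for (const row of board) {
    console.log(row.join("|"));
  }
}
```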

3. Eliminate conditional logic by embracing polymorphism and object-oriented patterns.

Never use if with else, unless we are checking against a data type we do not control.

Avoid hardcoded decisions. Conditional statements like if-else chains and switch statements represent hardcoded decisions that reduce flexibility and make code harder to extend. They force early binding, locking in behavior at compile time and hindering "change by addition." The rule "Never use if with else" pushes developers to seek more dynamic solutions.

Polymorphism as a solution. The preferred alternative is to replace type codes (enums, strings, integers) with classes and interfaces. The "Replace type code with classes" pattern transforms enum values into distinct classes, each implementing a common interface. Subsequently, "Push code into classes" moves the logic from the conditional branches directly into methods within these new classes.

Benefits of dynamic dispatch. This approach leverages polymorphism, where behavior is determined at runtime based on the object's type. Adding new functionality then simply involves creating a new class that implements the interface, rather than modifying existing if-else or switch statements scattered throughout the codebase. This significantly improves maintainability and extensibility.
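
A minimal sketch of the two patterns working together, using a traffic-light style example in the spirit of the book (the interface and class names here are assumptions): the type code disappears, and the logic from each conditional branch moves into the matching class.

```typescript
// Before: enum TrafficColor { RED, GREEN, YELLOW } plus a switch in every caller.
// After: one class per value, with the branch logic pushed into the classes.
interface TrafficColor {
  isRed(): boolean;
  next(): TrafficColor;
}

class Red implements TrafficColor {
  isRed(): boolean { return true; }
  next(): TrafficColor { return new Green(); }
}

class Green implements TrafficColor {
  isRed(): boolean { return false; }
  next(): TrafficColor { return new Yellow(); }
}

class Yellow implements TrafficColor {
  isRed(): boolean { return false; }
  next(): TrafficColor { return new Red(); }
}

// Adding a new color is change by addition: one new class, no edited conditionals.
```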

4. Defend your data through rigorous encapsulation to localize invariants.

Do not use getters or setters for non-Boolean fields.

Protecting invariants. Encapsulation is paramount for maintaining data integrity and reducing fragility. By limiting direct access to data, we ensure that its invariants (assumptions about its state) are maintained within the class, making them local and easier to manage. The rule "Do not use getters or setters for non-Boolean fields" strictly enforces this principle.

Push-based architecture. Getters and setters, while seemingly innocuous, break encapsulation by exposing internal data structures, leading to tight coupling and a "pull-based" architecture. Instead, a "push-based" architecture is favored, where computations are pushed as close to the data as possible. This means passing data as arguments or invoking methods on the data itself, rather than pulling data out to be processed elsewhere.

Encapsulate data pattern. The "Encapsulate data" pattern systematically moves variables and related methods into a new class, making them private and accessible only through well-defined behaviors. This process clarifies cohesion, simplifies method names, and often leads to more, smaller, and more focused classes, ultimately enhancing the system's overall robustness.
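
A small sketch of the push-based style (illustrative names, not the book's code): rather than a getter that leaks the internal map, callers push words in and push an action to be applied to each entry.

```typescript
class WordCounts {
  private counts = new Map<string, number>();

  // The computation is pushed to the data instead of pulling `counts` out.
  add(word: string): void {
    this.counts.set(word, (this.counts.get(word) ?? 0) + 1);
  }

  // Callers say what should happen to each entry; the map itself never escapes.
  forEach(action: (word: string, count: number) => void): void {
    for (const [word, count] of this.counts) action(word, count);
  }
}

const counts = new WordCounts();
["five", "lines", "five"].forEach((w) => counts.add(w));
counts.forEach((w, n) => console.log(`${w}: ${n}`));
```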

5. Collaborate actively with your compiler to enhance code safety and correctness.

If you’re the smartest person in the room, you’re in the wrong room.

Compiler as a teammate. The compiler should be viewed as an invaluable member of the development team, not an adversary. It performs crucial static analyses that guarantee certain properties of the code, catching errors that humans often miss. Leveraging its strengths can significantly increase software safety and reduce bugs.

Exploiting compiler strengths. Key compiler strengths include:

  • Reachability: Ensuring all code paths return or handle exceptions.
  • Definite assignment: Verifying variables are initialized before use.
  • Access control: Enforcing privacy and preventing accidental data exposure.
  • Type checking: Proving properties about data and method signatures.

These capabilities can be used as a "todo list" during refactoring, to enforce sequence invariants, and to detect unused code.

Avoiding compiler weaknesses. Developers should avoid practices that undermine the compiler's ability to help, such as using casts, dynamic types, or unchecked exceptions. These practices effectively disable the compiler's safety checks, shifting the burden of correctness onto the developer and increasing the risk of runtime errors like null dereferences or arithmetic issues.
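
As a small illustration of the "todo list" and definite-assignment points (a TypeScript sketch, not code from the book): adding a member to an interface makes the compiler flag every class that still has to implement it, and the compiler rejects any path on which a variable could be read before it is assigned.

```typescript
interface Shape {
  area(): number;
  // Adding `perimeter(): number;` here makes the compiler list every
  // implementing class that still lacks it: an automatic todo list.
}

class Square implements Shape {
  constructor(private side: number) {}
  area(): number { return this.side * this.side; }
}

function describe(s: Shape): string {
  let label: string;
  if (s.area() > 100) {
    label = "large";
  } else {
    label = "small"; // delete this branch and definite assignment fails below
  }
  return `${label} shape`;
}
```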

6. Cultivate a "less is better" mindset by relentlessly deleting unnecessary code.

Less is better.

Code as a liability. Contrary to popular belief, code is not an asset; it's a liability. Every line of code incurs maintenance costs, adds complexity, and introduces potential for bugs. The "sunk-cost fallacy" often leads developers to cling to code simply because it was expensive to produce, but true value comes from functionality, not investment.

Incidental complexity. Code accumulates incidental complexity through:

  • Technical ignorance: Lack of skill leading to poor design.
  • Technical waste: Shortcuts taken under time pressure.
  • Technical debt: Deliberate suboptimal solutions with an expiry date.
  • Technical drag: Inherent slowdown from a growing codebase, documentation, or tests.

The goal is to eliminate ignorance, waste, and debt, and to minimize drag.

Ruthless deletion. The solution is to delete as much as possible, but no more. This applies to unused features, code, documentation, tests, configuration flags, and even version control branches. Techniques like the "strangler fig pattern" help identify and isolate legacy code for gradual removal, while "spike and stabilize" allows for rapid experimentation and deletion of unadopted features.
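
As a rough sketch of the strangler-fig idea under stated assumptions (the ./legacy module and function names are hypothetical): every call into legacy code passes through one gateway that counts usage, so paths that never fire in production can be deleted with evidence rather than guesswork.

```typescript
import { legacyExportReport } from "./legacy"; // hypothetical legacy module

const usage = new Map<string, number>();

// Wrap a legacy function so each call is counted before being forwarded.
function strangled<A extends unknown[], R>(name: string, fn: (...args: A) => R) {
  return (...args: A): R => {
    usage.set(name, (usage.get(name) ?? 0) + 1);
    return fn(...args);
  };
}

export const exportReport = strangled("exportReport", legacyExportReport);
// Counters that stay at zero mark code that is safe to remove.
```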

7. Overcome the fear of adding code; it enables safer changes.

If something is scary, do it more—until it isn’t scary anymore.

Embrace experimentation. Fear of failure, waste, or imperfection often paralyzes developers, leading to over-analysis and procrastination. Software development is about learning and codifying knowledge, and the most effective way to learn is through experimentation. This requires courage and psychological safety within the team.

Spikes and gradual improvement. The "spike" methodology encourages rapid, low-quality implementations to gain knowledge and confidence, with the explicit understanding that the code will likely be discarded. This shifts the product from "code" to "knowledge." Furthermore, adopting a philosophy of gradual improvement, rather than striving for unattainable perfection, maximizes practice and shortens feedback loops, leading to faster learning and higher quality over time.

Modification by addition. Adding code is often safer than modifying existing code, as it avoids breaking existing functionality. This principle is leveraged through:

  • Code duplication: Initially duplicating code allows for safe experimentation and local changes, with unification occurring later if appropriate.
  • Extensibility: Designing for variation points (e.g., using strategy patterns) allows new functionality to be added as new classes without altering existing ones.
  • Backward compatibility: New features are introduced as new methods or API endpoints, preserving older versions for existing users.
  • Feature toggles: Using flags to enable/disable features allows code to be integrated and deployed safely, even if not yet ready for release.
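
For the last point, a minimal feature-toggle sketch (flag and function names are assumptions; real systems usually read the flag from configuration): the new path is added next to the old one and ships disabled until it is ready.

```typescript
const USE_NEW_RECEIPT = false; // toggle off: the new code is deployed but dark

function checkout(items: string[]): string {
  return USE_NEW_RECEIPT ? newReceipt(items) : legacyReceipt(items);
}

function legacyReceipt(items: string[]): string {
  return `receipt: ${items.length} items`;
}

// Added alongside the old path instead of modifying it.
function newReceipt(items: string[]): string {
  return ["receipt:", ...items].join("\n");
}
```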

8. Let code's inherent structure dictate your refactoring strategy.

Every block of stone has a statue inside it, and it is the task of the sculptor to discover it.

Code as codified structure. Software models aspects of the real world, and this real-world structure is inherently reflected in the code. Refactoring is akin to a sculptor revealing the statue within a block of stone, molding the code to expose and solidify its underlying structure. This makes the code more malleable for future changes that align with its natural evolution.

Three forms of behavior. Behavior can be embedded in code in three ways:

  • Control flow: Explicit loops, conditionals, and method calls. Easy for big changes, but can be fragile.
  • Data structure: Behavior encoded in the relationships and invariants of data types (e.g., a Binary Search Tree). Offers type safety and locality.
  • Data itself: Behavior encoded directly in values or references. Most difficult to work with safely due to lack of compiler support.

Refactoring often involves moving behavior between these forms to optimize for stability or change.

Uncovering hidden structures. Developers often leave clues about perceived structure through:

  • Whitespace: Blank lines separating logical blocks of code or fields.
  • Duplication: Repeated statements, methods, or classes.
  • Common affixes: Variables or methods sharing prefixes/suffixes (e.g., playerX, playerY).
  • Runtime type inspection: Using instanceof or typeof to differentiate behavior.
    These "unexploited structures" are prime candidates for refactoring patterns like "Extract method," "Encapsulate data," and "Introduce strategy pattern."

9. Prioritize simplicity over premature optimization or excessive generality.

Simplicity is essential because humans have limited cognitive capacity; we can only hold so much information in our heads at one time.

Cognitive load. The core principle guiding all design decisions should be simplicity, aiming to minimize the cognitive load required to understand and maintain the code. Complexity arises from coupled components and numerous invariants, both of which are exacerbated by premature optimization and excessive generality.

Generality's pitfalls. Adding generality increases a component's potential uses, leading to more couplings and a wider range of scenarios to consider. This often makes the code more burdensome than helpful, as illustrated by the "Swiss Army knife" analogy. Generality should be introduced minimally, ideally emerging naturally from the "duplicate, transform, unify" refactoring process, and only when components have reached similar levels of stability.

Optimization's cost. Performance optimization relies on exploiting invariants, which adds cognitive overhead. It should only be pursued when performance tests explicitly fail, indicating a bottleneck. Before optimizing, code should be well-refactored to clarify invariants. Safe optimization techniques include:

  • Resource pooling: Optimizing concurrent systems by dynamically allocating resources to bottlenecks.
  • Profiling: Identifying "hot spots" where most time is spent.
  • Data structure/algorithm choice: Selecting efficient structures with equivalent interfaces.
  • Caching: Storing computed results to avoid recalculation.

Any highly tuned code should be isolated and clearly marked to deter unnecessary future modifications.
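
As a tiny illustration of the caching item above (an illustrative function, not from the book): results are stored by key so a profiled hot spot is not recomputed, at the price of a new invariant, namely that the cache must stay consistent with the computation.

```typescript
const scoreCache = new Map<number, number>();

function score(n: number): number {
  const hit = scoreCache.get(n);
  if (hit !== undefined) return hit; // reuse earlier work

  let result = 0;
  for (let i = 1; i <= n; i++) result += Math.sqrt(i); // stand-in for real work
  scoreCache.set(n, result);
  return result;
}
```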

10. Make bad code conspicuously bad to signal underlying process issues.

If you cannot make it good, make it stand out.

Signaling unsustainability. When constraints (like time or complexity) prevent code from being refactored to a "pristine" state, it's better to make it overtly bad rather than "good enough." This signals that the development process is unsustainable and draws attention to areas needing future work. It requires psychological safety within the team, where messengers of bad news are appreciated.

Segregating codebases. Intentionally bad code helps segregate the codebase into "pristine" and "legacy" sections. Pristine code, once clean, tends to stay clean (the "broken window theory" in reverse). Conspicuously bad code, on the other hand, constantly reminds developers of its poor quality, increasing the likelihood of it being addressed when resources become available.

Safe vandalizing techniques. Anti-refactoring must adhere to three rules:

  • Never destroy correct information: Preserve good names or add comments.
  • Do not make future refactoring harder: Suggest future refactoring steps.
  • The result should be eye-catching: Make the code stand out.

Practical methods include:

  • Using enums or strings/ints as type codes.
  • Inlining magic numbers.
  • Adding comments that suggest method extractions.
  • Inserting whitespace to group statements or fields.
  • Grouping elements by common affixes.
  • Adding context to names (e.g., _ArrUtil).
  • Creating long methods by inlining.
  • Giving methods many parameters.
  • Using getters and setters (as a temporary step before pushing logic).

These techniques make problematic areas immediately visible, guiding future improvement efforts.
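
A small sketch of what such deliberately conspicuous code might look like (the method name and suffix are invented for illustration): the extraction comments record the structure a future refactoring should restore, and the inlined magic number and context-laden name make the code impossible to mistake for finished work.

```typescript
function printReport_ArrUtil(rows: string[][]): string {
  // Extract method: formatHeader
  let out = "REPORT\n======\n";

  // Extract method: formatRows
  for (const row of rows) {
    out += row.join(",").padEnd(80, " ") + "\n"; // 80: inlined magic width
  }
  return out;
}
```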
