Granularity

Executive Summary

Excessive precision is not only useless but harmful because:

a. It leads to fundamentally incorrect decisions, such as assuming 5.0001 > 5.0000, despite both values having 10% precision and 60% confidence—making the fourth decimal place meaningless.

b. It slows down execution to the point of uselessness and bloats the code beyond what the processor's memory can hold.
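A minimal sketch of point (a), assuming a relative precision of 10% (the function name and the way the threshold is derived are illustrative): a comparison that refuses to order two readings whose difference is smaller than the measurement uncertainty, instead of concluding that 5.0001 > 5.0000.

```python
def compare(a: float, b: float, rel_precision: float = 0.10) -> int:
    """Compare two measurements, returning -1, 0 or 1.

    Values whose difference is smaller than the measurement uncertainty
    are reported as indistinguishable (0) rather than ordered.
    rel_precision is the assumed relative precision of each reading (10% here).
    """
    uncertainty = rel_precision * max(abs(a), abs(b))
    if abs(a - b) <= uncertainty:
        return 0          # the extra decimal places carry no information
    return -1 if a < b else 1

# 5.0001 and 5.0000 differ by far less than the 10% uncertainty,
# so the fourth decimal place must not drive a decision.
assert compare(5.0001, 5.0000) == 0
assert compare(10.0, 5.0) == 1
```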

Problem:

  • The number of possible values for any parameter is always finite, and we know them only with a certain precision and confidence; this was already discussed in the “1+1 ≠ 2” section.

  • It is critical to understand that analytical formulas that are “correct” and “precise” and that account for events with a probability of 0.00001% are not only useless but harmful, especially when our input data is only 80% reliable or has a precision of 0.15%.

    • It is important to clearly recognize this harm.

    • The logic of “since our input data is uncertain, let’s at least calculate everything else with absolute precision” is fundamentally flawed because:

      • It wastes resources at every level (see the sketch after this list):
        • Instead of 1 byte per variable → 4 bytes.
        • Instead of a simple loop → a complex module.
        • Instead of 100 bytes per program → 10 kB.
        • As a result, instead of 0.1 seconds per computation (imperceptible to a human) → 1 second, which makes the system unusable.
  • This is a classic trap of quantity turning into quality.

    • This transition is not just abstract philosophy that is “nice in theory but hard to apply to code.”
    • It is a fundamental law that must guide both code and system design—from the very foundation.
    • When applied correctly from the lowest level, everything becomes simpler and more natural.
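A minimal sketch of the 1-byte-vs-4-bytes point, assuming the input's useful precision is no better than roughly 0.5% of full scale (the range, names, and numbers are illustrative): 256 levels, i.e. one byte, already resolve finer than the input can, so a 4-byte float adds nothing.

```python
# Minimal sketch: quantize a reading into one byte when the input precision
# cannot justify more resolution. The 0..100 range and the 0.5% useful
# precision are assumptions for illustration only.
FULL_SCALE = 100.0          # hypothetical measurement range: 0..100 units
LEVELS = 256                # one byte -> step of ~0.39% of full scale

def to_byte(value: float) -> int:
    """Map a reading onto 256 levels; finer detail is noise anyway."""
    clipped = min(max(value, 0.0), FULL_SCALE)
    return round(clipped / FULL_SCALE * (LEVELS - 1))

def from_byte(code: int) -> float:
    """Recover the reading with the granularity the input actually supports."""
    return code / (LEVELS - 1) * FULL_SCALE

code = to_byte(42.337)        # stored in 1 byte instead of a 4-byte float
print(code, from_byte(code))  # 108  42.35... (the difference is below the noise)
```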

Key Takeaways and Practical Steps

  • Granularity should initially be chosen randomly, just like anything else without prior history.

    • It can start with some buffer and shrink if unnecessary.
    • Or it can start coarse and refine itself until reaching the point where “the effort is not worth the gain” (see the first sketch after this list).
  • Avoid excessive literalism.

    • If we previously measured the current in kA but later discovered that it actually operates in microamperes, we should not suddenly switch the voltage scale to kV.
    • Yes, randomness plays a role, and we could start from either extreme, but we almost always have some prior experience.
    • The system should extract common-sense patterns (see the second sketch after this list):
      • Amperes are associated with volts.
      • Microamperes are (probably) associated with microvolts.
    • However, the system can never be certain—as proven earlier, absolute certainty is only ever an accident.
  • “The effort is not worth the gain” is a balance of cost vs. benefit.

    • The system should automatically assign cost to everything and minimize it whenever possible.
    • If increased precision still yields more benefit than cost, granularity should increase—more digits, finer time resolution, etc.
    • However, saturation must be a core principle of system design—beyond a certain point, more precision stops being useful.
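A minimal sketch of the two points about starting coarse and stopping at saturation (the benefit and cost functions are placeholder assumptions; only the shape of the loop matters): granularity is refined only while one more refinement step is estimated to gain more than it costs.

```python
def refine_granularity(initial_step: float, benefit_of, cost_of,
                       min_step: float = 1e-9) -> float:
    """Start coarse and halve the step only while it still pays off.

    benefit_of(step) and cost_of(step) are whatever the system uses to price
    precision (placeholder callables here). Refinement stops as soon as the
    marginal benefit no longer exceeds the marginal cost, i.e. at the
    "effort is not worth the gain" point (saturation).
    """
    step = initial_step
    while step > min_step:
        finer = step / 2
        marginal_benefit = benefit_of(finer) - benefit_of(step)
        marginal_cost = cost_of(finer) - cost_of(step)
        if marginal_benefit <= marginal_cost:
            break               # saturation: more precision stops being useful
        step = finer
    return step

# Hypothetical pricing: benefit saturates, cost grows as the step shrinks.
def benefit(step: float) -> float:   # caps at 1.0, so extra precision saturates
    return 1.0 - min(step, 1.0)

def cost(step: float) -> float:      # finer steps cost more to compute and store
    return 0.01 / step

print(refine_granularity(1.0, benefit, cost))   # settles on a finite step (0.125)
```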

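And a minimal sketch of the common-sense-pattern bullet (the table is a hypothetical prior, not a physical law): the scale discovered for one quantity suggests a default scale for a related one, while the system remains free to be proven wrong.

```python
# Hypothetical prior: if the current turned out to live on a given scale,
# start by expecting the voltage on the matching scale. This is only a
# default guess; the system keeps measuring and can always be proven wrong.
SCALE_PRIOR = {
    "kA": "kV",
    "A": "V",
    "mA": "mV",
    "uA": "uV",
}

def expected_voltage_scale(current_scale: str) -> str:
    """Return the default voltage scale suggested by the current's scale."""
    return SCALE_PRIOR.get(current_scale, "V")   # fall back to a neutral default

# The current was re-discovered to be in microamperes, so the first guess for
# the voltage is microvolts, not kilovolts.
print(expected_voltage_scale("uA"))   # uV
```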