Large things done in small quantities lead to great things

Xander Reynolds
5 min read · Jan 19, 2021


I was reading a book recently about DevOps (The DevOps Handbook: How to Create World-Class Agility, Reliability, and Security in Technology). DevOps is not the most exciting topic, but it is important.

The point of this blog is not to talk about DevOps, but about an idea that was mentioned in the book.

There was an example about manufacturing. Let’s say you had to mail 200 letters. How would you do this? There are a few options, but they would all involve folding the letters, stuffing the envelopes, putting the stamps and addresses on them, and finally shipping them. Two possible approaches:

  1. Focus on one task at a time:
  • Fold 200 letters
  • Stuff 200 envelopes
  • Put stamps & addresses on 200 envelopes
  • And finally, mail the letters

  2. Focus on completing 1 envelope at a time:
  • Fold 1 letter
  • Stuff 1 envelope
  • Put stamp & address on 1 envelope
  • Repeat 200 times
  • And finally, mail the letters

When I read this, my first thought was: it’s obvious, the first approach would be the most efficient way to do it! When you focus on one task, you can develop a routine and an efficient way to accomplish it because of its repetitious nature. Then I read more.

But think about this scenario. What if, when the first envelope is finished, you realize there is a mistake, for example, a mistake in the address?

  • How long would it take to realize the mistake?
  • How much waste would you have?

With the first approach, you only find the mistake once you reach the stamping and addressing step. By then, all 200 letters have been folded and all 200 envelopes stuffed before the first stamp goes on. That may not sound so bad, but what if fixing the mistake means re-stuffing 200 envelopes? That would be a big waste. In addition, it took 200 stuffed envelopes to find the mistake.

Let’s contrast this with the second approach. The first letter is folded and put in its envelope, and when the stamp and address are placed, the mistake is found. The error is caught on the first letter and can be fixed quickly. The lessons here are:

  1. Smaller quantities of completed work (notice the word completed)
  2. Quick discovery of mistakes AND fixes (keyword AND!)

This discussion made me reconsider my idea of how to “efficiently” accomplish work. I realized these concepts can be applied in other ways too.

One key word is completed. It means taking something from start to finish. Sure, it can feel easier to start many things and wait to finish all the tasks at once. But when a piece of work is truly completed, you gain insights that are not possible to know otherwise. This is where the discovery, and the fix, get applied to the next piece in the sequence.

First Real-Life Example

There have been many times when I had a great idea and started straight with the implementation. After several hours, I realized: oh, this will never work! The worst part is, there was a very quick way to find that out. What a waste! Again, the idea is to do something small, make it complete, and learn. It’s a quick feedback loop.

The better way to do this: have an idea, think about the main parts of the software that would need to change, do a quick test to see whether those parts can do what is needed, and then, and only then, start the implementation.

Another Real-Life Example

During the month of December, most of my team was on vacation (lucky me, I got to hold down the fort). During this time, there were some issues with our version control server, and I was not able to access it remotely (and the office was closed). For a month, I developed software without being able to merge any of my changes. I could commit locally with Git, but I could not merge to our integration branch.
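A rough sketch of what that month looked like on the command line. This is only an illustration of a plain Git setup; the branch name integration, the commit message, and the file placeholder are stand-ins, not our real ones:

    # the version control server was unreachable, so all work stayed in local commits
    git add <changed files>                       # placeholder for whatever changed that day
    git commit -m "Axis position widget refactor"

    # how much unmerged work has piled up against the last-known integration state?
    git log --oneline origin/integration..HEAD
    git diff --name-only origin/integration..HEAD | wc -l   # around 80 files by the end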

After a month, everyone returned to the office, and the version control was restored! Hooray! But now… I had a month of work to merge… :(

I checked my git history and there were 80 files to merge! Agh, so many!! This would definitely cause some headaches.

My normal approach would be: okay, one big merge. This is gonna be painful, but then it’s done! I know there will be many non-trivial merges, but I can work through them slowly.

Then, the lightbulb came on — Small pieces of completed work.

The merge process looks like this:

  1. Merge the files to the integration branch
  2. Verify the team build is successful after the merge
  3. Fix any mistakes that would cause the build to fail.

And again, my options are:

  1. One big merge:
  • Merge ALL the files
  • Verify the team build runs (it definitely will not)
  • Fix mistakes (where to start??)

  2. Many small merges:
  • Break the merge into smaller pieces
  • Merge one small component
  • Verify the team build
  • Repeat until finished

Let’s try this. I broke down my last month’s work into smaller topics. As I worked, I created new branches and tags where each mini topic started and ended, and I gave each merge a tag with a name describing the topic of that section:

Merge1_Updates_for_vcx_proj — a small merge to fix the unit test build that no longer worked.

Merge2_Axis_Position_widget_refactor — some refactoring of a widget used by our team.

Merge3_ — a new feature that was added.

….

Merge_8 — fixes for some bugs found after extending our test cases.

I found 8 smaller topics that I could merge. Now my large merge was broken into smaller, more manageable pieces.
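Roughly, the tagging looked like the sketch below. The commit hashes are made up for illustration, the tag names are the ones from above, and the last command is just a quick way to count how many files one slice touches:

    # mark where each mini topic ended, with a descriptive tag name
    git tag Merge1_Updates_for_vcx_proj 3f2a1c9
    git tag Merge2_Axis_Position_widget_refactor 8d41b07
    # ...and so on, up to Merge 8

    # quick sanity check: how many files does a slice actually touch?
    git diff --name-only Merge1_Updates_for_vcx_proj Merge2_Axis_Position_widget_refactor | wc -l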

The breakdown was as follows:

Merge 1: 1 file
Merge 2: 31 files
Merge 3: 25 files
Merge 4: 11 files
Merge 5: 11 files
Merge 6: 18 files
Merge 7: 8 files
Merge 8: 16 files
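In practice, each round looked something like this. Again, the branch name integration is a stand-in for our real one, and the build step is whatever command kicks off your team build:

    git checkout integration
    git merge Merge1_Updates_for_vcx_proj          # one small, reviewable merge
    # run the team build here and wait for the result:
    # green? move on to the next tag. red? the cause is inside this one small slice.
    git merge Merge2_Axis_Position_widget_refactor
    # ...repeat, one tag at a time, until Merge 8 is in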

So I prepared for the first merge. I checked out a few files, and already something looked strange: I found a mistake made by someone else, and I fixed it.

Okay great, moving on. On Merge 3 and Merge 4, I found that some colleagues had also changed software in a way that would keep my changes from compiling. Argh!

BUT!

Because the merges were small, I could find EXACTLY what caused the issue and fix it. This still took a long time, but tell me, which would you prefer?

80 non-trivial file merges, or 8 mostly trivial merges of no more than 31 files each?

This is a no-brainer. Every non-trivial file merge is a chance for a mistake.

I was also able to verify that our team build still worked after each “mini” merge. In this process I also found a few missing includes. That would have been much harder with 80 changed files and a failing build: where would the problem be then?

I am convinced this is the way to proceed:

  1. Smaller quantities of completed work (notice the word completed)
  2. Quick discovery of mistakes AND fixes (keyword AND!)

Have you ever applied these ideas to your work?
