Evaluating Existing Codebases


"It works most of the time - we just need some updates done here and there.”

"We acquired two programs and want to combine them into one."

Shouldn't be too hard, right? After all, it's just building on what already exists.

As engineers, we don't know what we don't know, and we're allergic to assumptions. We can't start project timelines and estimates until we know the code.

We need to begin by evaluating the existing codebase. The main goal of this assessment phase is to understand the level of effort required for basic development tasks, refine estimates, and identify anything that needs to be addressed before feature work can begin.

Benefits of Evaluating Code

Create Realistic Timelines

When we develop our own code based on our common toolchain, we already know how long certain tasks should take.

Something analogous would be booking time at the auto repair shop. If your car needs an alternator, the mechanic can look up the agreed-upon time for changing it and charge for the service accordingly.

Software development always has unforeseen complexity that any estimate needs to account for. However, with enough experience, we can calibrate our estimates against what we've seen with the benefit of hindsight.

We don’t have that experience when we step into new-to-us codebases of unknown origin. Our evaluation intends to determine a coefficient of sorts that we can use to inform our more general estimation process.
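In numbers, the idea looks something like this. The figures below are purely illustrative assumptions, not values from any real evaluation:

```python
# Purely illustrative numbers: a baseline estimate for a task in a
# familiar codebase, scaled by a friction coefficient derived from
# the evaluation of the unfamiliar one.
baseline_hours = 8   # what this task would take in our own code
friction = 1.75      # assumed: unfamiliar framework, thin test coverage
estimate_hours = baseline_hours * friction
print(estimate_hours)  # 14.0
```

The coefficient itself is a judgment call informed by the assessment areas below; the point is only that the evaluation turns "it depends" into a number we can defend.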

Identify Strategic and Necessary Options

Along with aligning shared expectations between our team and clients, our evaluation allows us to recommend specific paths forward with the code.

Not all codebases are created equal. Some are well-engineered, have great structure and tool choices, and do not require much maintenance effort. As you might guess, this is a rare case.

We often find several red flags in other codebases that should be resolved before work can commence on the client's desired features and updates.

Sometimes, a codebase has gone through several teams’ hands before ours, and the mix-and-match of paradigms is evident.

Key Assessment Areas

So, what do I look for when new code crosses my desk? I have three goals:

  • Understand the general structure and expected tools
  • Step through parts of the code as if I were performing a code review for someone on my team
  • Identify any security issues

Architecture and Tools

The general structure is important because if we maintain this code, I need to know how to introduce it to the team member who will primarily work on it.

All the fundamental questions apply here:

  • What's the language?
  • Is the language being used well?
  • What are the steps to start development?
  • What framework is in use?

Code Quality

The code review step allows me to read a code sample with an eye toward detail.

Most often, we can't review the entire codebase in detail because the time required would be prohibitive. A developer needs to work in the codebase for a while to get a handle on the layout. Depending on the size, getting a good idea of all the pieces could take weeks to months.

We don't have time for that in a time-boxed evaluation, but we can get a general sense of how well-organized the code is.

For example, we can determine if a test suite exists. If it does, does it pass? Looking at a sampling of tests present, do they test what we expect them to test?
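As a minimal sketch of that check, the probe below uses Python's standard-library `unittest` discovery; the `tests` directory name is an assumption, and real projects may use pytest, `npm test`, or another runner entirely. It fabricates one trivial test file so the sketch runs standalone:

```python
import pathlib
import unittest

# Fabricate a tiny suite so this sketch is self-contained; in a real
# evaluation, point discovery at the project's existing test directory.
pathlib.Path("tests").mkdir(exist_ok=True)
pathlib.Path("tests/test_smoke.py").write_text(
    "import unittest\n"
    "class Smoke(unittest.TestCase):\n"
    "    def test_addition(self):\n"
    "        self.assertEqual(1 + 1, 2)\n"
)

# Does a suite exist, and does it pass today?
suite = unittest.TestLoader().discover(start_dir="tests")
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(f"ran={result.testsRun} failures={len(result.failures)} errors={len(result.errors)}")
```

Even a crude run like this answers two evaluation questions at once: whether tests exist, and whether the suite is green on day one.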

Security Vulnerabilities

To round out the assessment, we look for common patterns that would introduce security vulnerabilities. Patterns like SQL injection or cross-site scripting injection can be fairly easy to spot if you know what to look for.
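For example, here is the classic SQL injection red flag next to the safe alternative, shown with Python's standard-library `sqlite3` (the table and data are invented for the demo):

```python
import sqlite3

# Toy database for the demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

user_input = "alice' OR '1'='1"

# Red flag: user input interpolated directly into the SQL string.
# The injected OR clause matches every row in the table.
vulnerable = f"SELECT id FROM users WHERE name = '{user_input}'"
leaked = conn.execute(vulnerable).fetchall()

# Safe pattern: a parameterized query treats the input strictly as data.
safe = conn.execute(
    "SELECT id FROM users WHERE name = ?", (user_input,)
).fetchall()

print(leaked)  # [(1,)] -- the injection matched a row it shouldn't
print(safe)    # []     -- no user is literally named "alice' OR '1'='1"
```

During an evaluation we aren't fixing these; we're grepping for the first pattern (string-built queries) and noting where it appears.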

If we don't see anything obvious, we can report to our client that we did that due diligence. That doesn't guarantee the code has no security issues, but we have some confidence that the previous developers thought of it.

The Result: Recommendation

At the end of the evaluation, we meet with the client to review the results. Generally, we can provide some recommendations to move forward well. And we'll have some practical experience to back that up rather than just guesswork.

Originally published on 2025-04-23 by Matt Lewellyn

Reach out to us to discuss your complex deployment needs (or to chat about Star Trek)