
Understanding Time Complexity and Big O Notation with Java Examples

This article explains the concept of constant‑time operations, introduces Big O notation for describing algorithmic time complexity, and demonstrates how to calculate and interpret complexities such as O(1), O(N), O(N²) and O(N²·logN) through clear Java code examples.


Constant‑time operations are those whose execution time does not depend on the size of the input data; each such operation completes within a fixed amount of time.

The time complexity of an algorithm is expressed using Big O notation, which captures the growth rate of the number of constant‑time operations as a function f(N), ignoring lower‑order terms and constant coefficients, resulting in a notation like O(f(N)).

The algorithm's time complexity is written as O(f(N)), indicating that as the input size N grows, the execution time increases according to the function f(N).

To illustrate, we first identify the function f(N) that counts the number of constant‑time operations and then keep only the term with the highest growth rate.

If f(N) = c (c is a constant), the time complexity is O(1).

If f(N) = a*N + b (a and b are constants), the time complexity is O(N).

If f(N) = a*N^2 + b*N + c (a, b, c are constants), the time complexity is O(N^2).

If f(N) = a*N^2*logN + b*N + c, the time complexity is O(N^2*logN).
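
The article gives no code for the O(N^2*logN) case, so here is a minimal sketch (the class and method names are hypothetical, not from the article): inside N*N nested iterations, an inner while loop halves k each step and therefore runs about log2(N) times, giving roughly N^2 * logN constant‑time operations in total.

```java
// Hypothetical sketch of an O(N^2 * logN) method: the counter tallies the
// constant-time operations so the growth rate can be checked directly.
public class LogLinearExample {

    static long countOperations(int N) {
        long count = 0;
        for (int i = 0; i < N; i++) {
            for (int j = 0; j < N; j++) {
                int k = N;
                while (k > 1) {   // halves k, so it runs about log2(N) times
                    k /= 2;
                    count++;      // one constant-time operation
                }
            }
        }
        return count;
    }

    public static void main(String[] args) {
        // For N = 8: log2(8) = 3, so the count is 8 * 8 * 3 = 192.
        System.out.println(countOperations(8));
    }
}
```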

Examples

public String test() {
    System.out.println("hello world"); // executed 1 time
    return "hello"; // executed 1 time
}

The above method executes two constant‑time operations, so f(N)=2 and the complexity is O(1).

public void test(int N) {
    for (int i = 0; i < N; i++) {
        System.out.println("hello world1"); // executed N times
        System.out.println("hello world2"); // executed N times
    }
    System.out.println("hello world3"); // executed 1 time
}

This code runs 2N+1 constant‑time operations, giving f(N)=2N+1 and a complexity of O(N).

public void test(int N) {
    for (int i = 0; i < N; i++) {
        for (int j = 0; j < N; j++) {
            System.out.println("hello world"); // executed N*N times
        }
    }
}

The nested loops execute N^2 operations, so the complexity is O(N^2).
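
As a sanity check, the claimed counts can be verified by replacing each println with a counter increment (a hypothetical rewrite, not part of the original examples):

```java
// Hypothetical sketch: mirror the two methods above, but count the
// constant-time operations instead of printing, so f(N) can be checked.
public class OperationCount {

    // Mirrors the single-loop example: 2 operations per iteration plus 1 after.
    static long singleLoop(int N) {
        long count = 0;
        for (int i = 0; i < N; i++) {
            count++; // "hello world1"
            count++; // "hello world2"
        }
        count++;     // "hello world3"
        return count; // f(N) = 2N + 1
    }

    // Mirrors the nested-loop example: one operation per (i, j) pair.
    static long nestedLoops(int N) {
        long count = 0;
        for (int i = 0; i < N; i++) {
            for (int j = 0; j < N; j++) {
                count++; // "hello world"
            }
        }
        return count; // f(N) = N^2
    }

    public static void main(String[] args) {
        System.out.println(singleLoop(10));  // prints 21
        System.out.println(nestedLoops(10)); // prints 100
    }
}
```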

When conditional or sequential statements are present, the overall complexity is determined by the worst‑case (largest) term.

public void test(int N) {
    // Loop 1: O(N^2)
    for (int i = 0; i < N; i++) {
        for (int j = 0; j < N; j++) {
            System.out.println("hello world"); // N*N times
        }
    }
    // Loop 2: O(N)
    for (int i = 0; i < N; i++) {
        System.out.println("hello world1"); // N times
    }
}

The combined complexity is O(N^2 + N) = O(N^2).

public void test(int N, boolean condition) {
    if (condition) {
        // Loop 1: O(N^2)
        for (int i = 0; i < N; i++) {
            for (int j = 0; j < N; j++) {
                System.out.println("hello world"); // N*N times
            }
        }
    } else {
        // Loop 2: O(N)
        for (int i = 0; i < N; i++) {
            System.out.println("hello world1"); // N times
        }
    }
}

Regardless of the branch taken, the worst‑case complexity remains O(N^2).

Thank you for reading; I hope this article helps you understand time complexity.

Java · algorithm analysis · Time Complexity · big O · asymptotic notation
Written by

Selected Java Interview Questions

A professional Java tech channel sharing common knowledge to help developers fill gaps. Follow us!
