Transforming a Simple Factorial into a Concurrent Go Engine
This article explores how a basic factorial calculation can be turned into a sophisticated concurrent system in Go, covering deadlock pitfalls, fan‑out/fan‑in pipelines, Redux‑style state management, and a controllable streaming factorial engine with pause and resume capabilities.
🧭 1. From Multiplication to Concurrency: Awakening a Simple Problem
Factorial is a trivial math exercise, but in Go it becomes a gateway to concurrency concepts.
```go
func factorial(n int) int {
	if n == 0 {
		return 1
	}
	return n * factorial(n-1)
}
```

The above implementation is correct but single-threaded, and it does nothing to showcase Go's concurrency strengths.
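One limitation worth noting before adding concurrency: with `int`, the result overflows for n > 20 on 64-bit platforms. A minimal iterative baseline using math/big (a sketch, not from the original article) sidesteps this and matches the big.Int engine built later on:

```go
package main

import (
	"fmt"
	"math/big"
)

// factorialBig is an iterative baseline using math/big, since the
// plain int version overflows for n > 20 on 64-bit platforms.
func factorialBig(n int64) *big.Int {
	result := big.NewInt(1)
	for i := int64(2); i <= n; i++ {
		result.Mul(result, big.NewInt(i))
	}
	return result
}

func main() {
	fmt.Println(factorialBig(25)) // 15511210043330985984000000
}
```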
"Computation is never single‑threaded; it is a game of coordination and competition."
🧨 2. Deadlock Hell: Misusing Channels
Using channels for a concurrent factorial goes wrong easily: closing the channel while senders are still running causes a panic, and never closing it leaves the receiving range loop deadlocked.
```go
func factorialDeadlock(n int) int {
	ch := make(chan int)
	for i := 1; i <= n; i++ {
		go func(i int) { ch <- i }(i)
	}
	close(ch) // ❌ premature close: the sender goroutines may still be running
	result := 1
	for v := range ch {
		result *= v
	}
	return result
}
// panic: send on closed channel
```

"Controlling the channel lifecycle is the first lesson in concurrency."
🔍 How to Avoid Deadlock
- Let the sender control when to close the channel.
- Use sync.WaitGroup instead of assuming it is safe to close immediately.
- Refactor into a fan‑out/fan‑in pipeline.
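Following the checklist above, a corrected version (a sketch; `factorialWaitGroup` is an illustrative name, not from the original article) lets a WaitGroup decide when the channel closes:

```go
package main

import (
	"fmt"
	"sync"
)

// factorialWaitGroup fixes the premature-close bug: a sync.WaitGroup
// tracks the senders, and the channel is closed only after all of
// them have finished.
func factorialWaitGroup(n int) int {
	ch := make(chan int)
	var wg sync.WaitGroup
	for i := 1; i <= n; i++ {
		wg.Add(1)
		go func(i int) {
			defer wg.Done()
			ch <- i
		}(i)
	}
	// Close from a dedicated goroutine once every sender is done,
	// so the range loop below terminates cleanly.
	go func() {
		wg.Wait()
		close(ch)
	}()
	result := 1
	for v := range ch {
		result *= v
	}
	return result
}

func main() {
	fmt.Println(factorialWaitGroup(5)) // 120
}
```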
🧩 3. Fan‑Out/Fan‑In: Making Computation Flow
In Go, a channel is the water flow and a goroutine is the pipe. Fan‑out/fan‑in splits work, runs it in parallel, then merges results.
```go
func worker(id int, jobs <-chan int, results chan<- int) {
	for n := range jobs {
		fmt.Printf("Worker %d processing %d\n", id, n)
		results <- n
	}
}
```
```go
func fanOutFanInFactorial(n int) int {
	jobs := make(chan int, n)
	results := make(chan int, n)
	for w := 1; w <= 3; w++ {
		go worker(w, jobs, results)
	}
	for j := 1; j <= n; j++ {
		jobs <- j
	}
	close(jobs)
	result := 1
	for a := 0; a < n; a++ {
		result *= <-results
	}
	return result
}
```

This does not speed up the multiplication itself, which is still accumulated sequentially, but it exercises CPU parallelism at the scheduling level.
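The collection loop above works because the job count is known in advance. The general fan-in shape, sketched here with an illustrative `merge` helper (not from the original article), instead closes the merged channel only when every input is drained:

```go
package main

import (
	"fmt"
	"sync"
)

// merge fans in any number of result channels into one output channel,
// closing the output once every input is exhausted. A sync.WaitGroup
// counts the live inputs, echoing the deadlock fix from section 2.
func merge(chans ...<-chan int) <-chan int {
	out := make(chan int)
	var wg sync.WaitGroup
	for _, c := range chans {
		wg.Add(1)
		go func(c <-chan int) {
			defer wg.Done()
			for v := range c {
				out <- v
			}
		}(c)
	}
	go func() {
		wg.Wait()
		close(out)
	}()
	return out
}

func main() {
	a, b := make(chan int), make(chan int)
	go func() { a <- 2; a <- 3; close(a) }()
	go func() { b <- 4; close(b) }()
	product := 1
	for v := range merge(a, b) {
		product *= v
	}
	fmt.Println(product) // 24 — order of arrival does not matter for a product
}
```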
"Performance may not improve, but the architecture becomes more elegant. This is Go's concurrency aesthetics."
⚙️ 4. Redux‑Style State Management in Concurrency
Borrowing the Redux idea from front‑end development, we model factorial as a state machine where each multiplication is an action.
```go
type State struct {
	Value int
	Step  int
}

type Action struct {
	Type string
	Data int
}

func reducer(s State, a Action) State {
	switch a.Type {
	case "MULTIPLY":
		s.Value *= a.Data
		fmt.Println("action:", a.Type, "step:", s.Step, "value:", s.Value)
		s.Step++
	}
	return s
}
```
```go
func factorialRedux(n int) int {
	state := State{Value: 1, Step: 1}
	actions := make(chan Action)
	go func() {
		for i := 1; i <= n; i++ {
			actions <- Action{Type: "MULTIPLY", Data: i}
		}
		close(actions)
	}()
	for a := range actions {
		state = reducer(state, a)
	}
	return state.Value
}
```

This makes the computation observable, recoverable, and composable.
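To make the "recoverable" claim concrete, here is a small self-contained sketch (it duplicates the State/Action/reducer shapes so it compiles standalone, minus the logging) that replays a recorded action log to reconstruct an intermediate state:

```go
package main

import "fmt"

// Because the state is a pure function of the action log, replaying
// recorded actions through the reducer reconstructs any intermediate
// state — the essence of Redux-style recoverability.
type State struct {
	Value int
	Step  int
}

type Action struct {
	Type string
	Data int
}

func reducer(s State, a Action) State {
	if a.Type == "MULTIPLY" {
		s.Value *= a.Data
		s.Step++
	}
	return s
}

func main() {
	// Record the full action log for 5!.
	var log []Action
	for i := 1; i <= 5; i++ {
		log = append(log, Action{Type: "MULTIPLY", Data: i})
	}
	// Replay only the first three actions to recover the state mid-run.
	s := State{Value: 1, Step: 1}
	for _, a := range log[:3] {
		s = reducer(s, a)
	}
	fmt.Println(s.Value) // 6 — the partial product after three actions
}
```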
🔄 5. Streaming Factorial Engine: Pause, Resume, Subscribe
By treating factorial as a stream, we can control its execution with context.Context and expose HTTP endpoints for pause/resume.
```go
type IncrementalFactorial struct {
	current int
	value   *big.Int
	mu      sync.Mutex
}

// NewIncrementalFactorial initializes the fields; without it, value is a
// nil *big.Int and current starts at 0, which would zero out every product.
func NewIncrementalFactorial() *IncrementalFactorial {
	return &IncrementalFactorial{current: 1, value: big.NewInt(1)}
}

func (f *IncrementalFactorial) Next() *big.Int {
	f.mu.Lock()
	defer f.mu.Unlock()
	f.value.Mul(f.value, big.NewInt(int64(f.current)))
	f.current++
	return new(big.Int).Set(f.value)
}
```
```go
func (f *IncrementalFactorial) Start(ctx context.Context, out chan<- *big.Int) {
	go func() {
		for {
			v := f.Next()
			select {
			case <-ctx.Done():
				return
			// Selecting on the send as well means cancellation is
			// honored even while the consumer is slow to receive.
			case out <- v:
			}
		}
	}()
}
```

Endpoints such as /pause, /resume, and /progress let callers control and monitor the engine.
"This is the leap from algorithm to system: factorial becomes a concurrent, controllable engine rather than a simple function."
🧠 6. Lessons Learned from the Factorial Journey
Deadlock teaches the importance of channel lifecycle management.
Fan‑out/fan‑in shows that architecture can outweigh raw algorithmic speed.
Redux‑style state makes concurrent computation predictable.
Streaming engines demonstrate that concurrency is not just parallelism but also controllability.
🧩 Easter Egg Challenge
Implement a controllable factorial microservice with the following HTTP routes:
- /start: launch the incremental computation
- /pause: pause the goroutine
- /resume: resume computation
- /progress: report the current n and n!

Hint: combine sync.Cond with context.Context to regulate goroutine pacing.
Code Wrench
Focuses on code debugging, performance optimization, and real-world engineering, sharing efficient development tips and pitfall guides. We break down technical challenges in a down-to-earth style, helping you craft handy tools so every line of code becomes a problem‑solving weapon. 🔧💻