Understanding Go's Concurrency Model: Goroutine, Channel, Select, WaitGroup, and Context
This article explains Go's lightweight concurrency model, covering Goroutine creation, Channel communication, Select multiplexing, synchronization with WaitGroup and Context, and common concurrency patterns such as worker pools and pub/sub, along with best‑practice recommendations for building high‑performance concurrent programs.
One of Go (Golang)'s most powerful features is its lightweight concurrency model, which enables developers to write high‑performance concurrent programs easily. The model is built around two core concepts: Goroutine and Channel.
Goroutine's working principle and usage
Channel types and communication mechanism
Select statement for multiplexing
WaitGroup and Context for managing concurrent tasks
Common concurrency patterns and best practices
1. Goroutine: Lightweight Thread
Goroutine is a lightweight thread managed by the Go runtime. Compared with traditional OS threads, its creation and destruction cost is extremely low, allowing a Go program to spawn thousands of Goroutines effortlessly.
Basic Usage
Use the go keyword to start a Goroutine:
```go
package main

import (
	"fmt"
	"time"
)

func sayHello() {
	fmt.Println("Hello from Goroutine!")
}

func main() {
	go sayHello()               // launch a Goroutine
	time.Sleep(1 * time.Second) // crude wait; a WaitGroup (covered below) is the robust way
	fmt.Println("Main function")
}
```

Output:

```
Hello from Goroutine!
Main function
```

Goroutine Characteristics
Lightweight: initial stack is only 2KB and can grow dynamically
Scheduled by the Go runtime, not tied to OS threads
GOMAXPROCS controls parallelism (default equals number of CPU cores)
2. Channel: Communication Between Goroutines
Channels let Goroutines communicate by passing values instead of sharing memory, which avoids the data races that come with shared state.
Basic Channel Usage
```go
ch := make(chan int) // unbuffered Channel
go func() {
	ch <- 42 // send data
}()
value := <-ch      // receive data
fmt.Println(value) // prints: 42
```

Buffered vs Unbuffered Channels
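The difference shows up in when a send blocks. A minimal sketch using a buffered channel of capacity 2 (the specific values are arbitrary):

```go
package main

import "fmt"

func main() {
	ch := make(chan int, 2) // buffered: capacity 2

	// With room in the buffer, sends return immediately,
	// even though no receiver is ready yet.
	ch <- 1
	ch <- 2
	fmt.Println("len:", len(ch), "cap:", cap(ch)) // len: 2 cap: 2

	// Receives drain the buffer in FIFO order.
	fmt.Println(<-ch, <-ch) // 1 2

	// Once the buffer is full -- or for any unbuffered channel --
	// a send blocks until a receiver is ready.
}
```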
| Type | Characteristics | Example |
| --- | --- | --- |
| Unbuffered Channel (`make(chan T)`) | Send and receive must both be ready at the same moment; otherwise the operation blocks | `ch := make(chan int)` |
| Buffered Channel (`make(chan T, size)`) | Send blocks only when the buffer is full; receive blocks only when the buffer is empty | `ch := make(chan int, 3)` |

3. Select: Multiplexing Channels
The select statement lets a Goroutine listen on multiple Channels simultaneously:
```go
ch1 := make(chan string)
ch2 := make(chan string)
go func() { ch1 <- "from ch1" }()
go func() { ch2 <- "from ch2" }()

select {
case msg1 := <-ch1:
	fmt.Println(msg1)
case msg2 := <-ch2:
	fmt.Println(msg2)
case <-time.After(1 * time.Second): // timeout control
	fmt.Println("timeout")
}
```

4. Synchronization Mechanisms: WaitGroup & Context
WaitGroup: Waiting for Multiple Goroutines
```go
var wg sync.WaitGroup
for i := 0; i < 5; i++ {
	wg.Add(1)
	go func(id int) {
		defer wg.Done()
		fmt.Printf("Goroutine %d done\n", id)
	}(i)
}
wg.Wait() // blocks until all Goroutines finish
```

Context: Controlling Goroutine Lifetime
```go
ctx, cancel := context.WithTimeout(context.Background(), 2*time.Second)
defer cancel()

go func() {
	<-ctx.Done() // unblocks on timeout or explicit cancel
	fmt.Println("Goroutine canceled!")
}()
```

5. Common Concurrency Patterns
Worker Pool
```go
// worker consumes jobs and sends one result per job
// (doubling the value stands in for real work)
func worker(id int, jobs <-chan int, results chan<- int) {
	for j := range jobs {
		results <- j * 2
	}
}

jobs := make(chan int, 100)
results := make(chan int, 100)

// start 3 workers
for w := 1; w <= 3; w++ {
	go worker(w, jobs, results)
}

// send tasks
for j := 1; j <= 5; j++ {
	jobs <- j
}
close(jobs) // lets the workers' range loops terminate

// collect results
for r := 1; r <= 5; r++ {
	fmt.Println(<-results)
}
```

Pub/Sub (Publish‑Subscribe)
```go
type Topic struct {
	subscribers []chan string
}

// Subscribe registers a new subscriber and returns its channel.
func (t *Topic) Subscribe() chan string {
	ch := make(chan string)
	t.subscribers = append(t.subscribers, ch)
	return ch
}

// Publish sends msg to every subscriber; because the channels are
// unbuffered, each send blocks until that subscriber receives.
func (t *Topic) Publish(msg string) {
	for _, sub := range t.subscribers {
		sub <- msg
	}
}
```

6. Best Practices
Avoid shared memory; use Channels for communication
Use select to handle multiple Channels, timeouts, and cancellations
Employ sync.WaitGroup and context to manage Goroutine lifecycles
Limit the number of Goroutines (e.g., Worker Pool) to prevent resource exhaustion
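The last recommendation can be implemented with nothing more than a buffered channel acting as a counting semaphore; a minimal sketch (the limit of 3 and the 10 tasks are arbitrary):

```go
package main

import (
	"fmt"
	"sync"
)

func main() {
	const limit = 3
	sem := make(chan struct{}, limit) // buffer capacity bounds concurrency
	var wg sync.WaitGroup

	for i := 1; i <= 10; i++ {
		wg.Add(1)
		sem <- struct{}{} // acquire a slot; blocks once limit tasks are running
		go func(id int) {
			defer wg.Done()
			defer func() { <-sem }() // release the slot
			fmt.Printf("task %d running\n", id)
		}(i)
	}
	wg.Wait()
	fmt.Println("all tasks done")
}
```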
Conclusion
Go's concurrency model, based on Goroutine and Channel, enables developers to write high‑performance concurrent programs easily. Mastering these core concepts allows you to build efficient micro‑services, crawlers, real‑time data processing systems, and more.