
How to Unlock Elasticsearch Write Block and Fix Logstash 403 Errors

When Kafka backs up and Logstash fails to write to Elasticsearch with a 403 cluster_block_exception, the index is write-blocked. Clearing the index.blocks.write setting via the Elasticsearch API or Kibana Dev Tools resolves the error.

Practical DevOps Architecture

Symptom: a Kafka backlog causes Logstash to fail when writing logs to Elasticsearch.

Logstash error:

[logstash.outputs.elasticsearch][main][] Retrying failed action {:status=>403, :error=>{"type"=>"cluster_block_exception", "reason"=>"index [] blocked by: [FORBIDDEN/8/index write (api)];"}}

Cause: checking the index settings in Kibana Dev Tools shows index.blocks.write: true, which blocks all write operations on the index. The (api) suffix in the error message indicates the block was applied through the settings API, not by the disk flood-stage watermark (which sets a different block, index.blocks.read_only_allow_delete).
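Before removing the block, it can help to confirm which indices carry it. The setting can be read back in Kibana Dev Tools using the settings API's wildcard filter (the path below queries all indices):

```
GET /_all/_settings/index.blocks.*
```

Indices with the lock show "write": "true" under "blocks"; unaffected indices return no entry.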

Solution: Remove the write block so the index can accept writes.

Method 1 (CLI): Execute on the Elasticsearch server:

curl -u username:password -X PUT -H "Content-Type: application/json" "http://localhost:9200/_all/_settings" -d '{"index.blocks.write": null}'

Method 2 (Kibana Dev Tools):

PUT /_all/_settings
{
  "index.blocks.write": null
}
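Note that _all clears the block on every index in the cluster. If only one index is affected, it is safer to scope the request to it; my-index below is a placeholder for the blocked index name:

```
PUT /my-index/_settings
{
  "index.blocks.write": null
}
```

Once the setting is cleared, no Logstash restart is needed: the 403s were logged as retried actions, and the retries succeed as soon as the block is lifted.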
Tags: Kafka, Troubleshooting, Logstash
Written by

Practical DevOps Architecture

Hands‑on DevOps operations using Docker, K8s, Jenkins, and Ansible—empowering ops professionals to grow together through sharing, discussion, knowledge consolidation, and continuous improvement.
