Dillygence

Bottlenecks: diagnosing through simulation

The factory bottleneck isn't always visible. Methodology, measurements, and simulation to find the constraint that caps throughput.

Introduction: identify the truly limiting workstation, not the one that "stands out"

In many factories, a single undersized workstation is enough to create weeks of WIP (Work In Progress, work-in-process inventory), without anyone being sure which one it is. The consequences show up everywhere: customer lead times slip, overtime accumulates, firefighting takes over, then CAPEX (capital expenditure) is launched at random. The point is not “having a bottleneck”; the point is proving where it is actually located. European industry loses margin by optimizing what shines on the shopfloor, not what caps overall throughput.

Most top-ranking content stops at a definition, two generic examples, and a list of lean manufacturing tools. A full end-to-end decision method is rare, and a protocol that can falsify a diagnosis rarer still. The outcome is predictable: you treat a symptom, and the problem moves elsewhere.

Key takeaway: Accumulated stock is an insufficient indicator: it can signal a constraint, but it does not locate it on its own. In complex or variable systems, dynamic modeling is needed to locate the constraint reliably.

 

Definition: a bottleneck (in French, “goulot d'étranglement”) is the resource that limits the overall throughput of a production system at a given time. In plain terms, a production bottleneck is often a capacity constraint that blocks shipping.

The 3 signs of a bottleneck:

  • persistent saturation

  • direct impact on shipped throughput

  • sensitivity to micro-stops

 

I- Define the constraint that caps throughput: definition, scope, and a useful metric

A bottleneck is defined simply: it is the resource that caps the system's overall throughput over a given period. The sign is not “a lot of stock”; the sign is “throughput does not exceed the capacity of that point”. The useful metric: overall throughput = effective capacity of the constraint. As long as this constraint does not move, sellable output does not move either.

Goulot vs goulet: two spellings, one operational reality

In industry, “goulot” and “goulet” coexist in usage. Both refer to the same reality: a narrower section that imposes a passage limit. “Goulot d'étranglement” remains the most frequent form; “goulet d'étranglement” also appears depending on editorial preferences. What matters is not spelling, but the proof protocol.

Overall throughput, queue, saturation: the mini-model that avoids pointless debates

An industrial flow behaves like a queue: when input exceeds a workstation's capacity, WIP rises. If workstation A runs at 60 parts/hour and workstation B at 45 parts/hour, then B caps throughput at 45 parts/hour, even if A runs “flat out”. In that case, pushing A creates WIP, not revenue. Control becomes effective when it protects B, not when it keeps A busy.
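The two-workstation example can be checked in a few lines. A minimal sketch in Python, using the hypothetical rates above (60 and 45 parts/hour): pushing A only converts its excess rate into inventory.

```python
def simulate_two_stations(hours, rate_a=60, rate_b=45):
    """Deterministic flow model: A pushes rate_a parts/hour into a buffer,
    B ships at most rate_b parts/hour out of it."""
    wip = 0        # parts waiting between A and B
    shipped = 0
    for _ in range(hours):
        wip += rate_a                  # A runs "flat out"
        processed = min(wip, rate_b)   # B can only process its own capacity
        wip -= processed
        shipped += processed
    return shipped, wip

shipped, wip = simulate_two_stations(hours=8)
print(shipped, wip)  # 360 shipped (45/h over 8 h); 120 parts of WIP created by A
```

After one shift, every part A produced beyond 45/hour sits in front of B as inventory, while shipped output never exceeded B's capacity.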

TRS and OEE: useful indicators… but risky out of context

TRS (taux de rendement synthétique, the French equivalent of OEE) and OEE (Overall Equipment Effectiveness) describe local performance: availability, performance, and quality. They help hunt losses, but they do not automatically point to the constraint. A machine can show a low OEE while remaining non-limiting, because the system has slack elsewhere. The right approach is to combine TRS/OEE with flow analysis, queue observation, and sensitivity to micro-stops.
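As a reminder of the arithmetic behind these indicators, OEE is the product of the three local ratios. A minimal sketch; the figures below are illustrative, not from a real line.

```python
def oee(planned_time, run_time, ideal_cycle, total_count, good_count):
    """OEE = availability x performance x quality."""
    availability = run_time / planned_time              # share of planned time actually running
    performance = (ideal_cycle * total_count) / run_time  # actual vs ideal pace
    quality = good_count / total_count                  # share of good parts
    return availability * performance * quality

# Example: 480 min planned, 400 min running, 0.8 min ideal cycle time,
# 450 parts produced of which 432 are good.
value = oee(480, 400, 0.8, 450, 432)
print(round(value, 3))  # 0.72
```

Note that this 0.72 says nothing, on its own, about whether the machine is the constraint: that verdict requires the flow analysis described above.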

 

II- The “false bottleneck”: why intuition fails on the shopfloor

A false bottleneck appears when a workstation looks like the blocker while the real constraint sits elsewhere. The shopfloor often names the culprit based on visual proximity: upstream WIP, operators under pressure, visible agitation. Yet the most expensive cause often hides downstream, in quality, internal logistics, or scheduling rules. Without proof, intuition burns time and budget.

Dispatching, batch sizes, and quality variability: three congestion generators

Unstable dispatching creates unnecessary changeovers, therefore setup time, therefore queues. Oversized batch sizes create “waves”: the downstream workstation gets too much, then nothing. Quality variability adds rework loops, therefore delays that propagate. Each factor can look like a bottleneck, while it is only degrading flow.

Mini-case: a minor downstream issue that turns an upstream workstation into the perfect culprit

What: on an assembly line, an upstream workstation ends up with massive WIP and becomes “the bottleneck” in daily discussions.

How: analysis shows a series of micro-stops on a downstream test station, then a slow restart that blocks flow.

Impact: the team improves test maintenance and adjusts the release rule; upstream WIP drops and shipped throughput increases, with no added equipment. Visible stock masked an intermittent downstream constraint.

 

III- Dynamic and “nomadic” bottlenecks: the constraint moves with product mix

In multi-SKU production, the constraint moves. Product mix changes cycle times, setups, priorities, and sometimes routing. A workstation can be limiting in the morning and not in the afternoon. A static method misses this reality and leads to investments that are “too late” or “in the wrong place”.

A CAPEX can lift a local constraint, then create extra load elsewhere. The system quickly finds a new limiting point, often not anticipated. Improving one workstation increases downstream inflow: if downstream cannot keep up, WIP grows and throughput time degrades. CAPEX should not be judged workstation by workstation; it should be judged by shipped throughput and customer lead time.

TOC applied statically: the spreadsheet trap

TOC (Theory of Constraints), popularized by Eliyahu M. Goldratt, provides a powerful framework: identify, exploit, subordinate, elevate, then repeat. The trap is applying it statically in a spreadsheet, using averages and smoothed flows. An average erases variability, so it erases queues and threshold effects. The diagnosis looks clean on paper and wrong on the shopfloor. For a reference baseline, TOCICO (Theory of Constraints International Certification Organization) summarizes the framework and its concepts.

 

IV- Identification procedure in production: observe, measure, prove

Identifying a bottleneck requires a procedure, not a debate. The goal is to isolate the resource that caps shippable throughput, on a defined scope and time horizon. The method must combine shopfloor observation, data, and flow reading. Each step must end with falsifiable evidence.

  1. Frame the system: define the scope (line, workshop, plant), group product families with comparable routings, and cover several days or weeks to capture variability.

  2. Instrument the flow: measure cycle times, stoppages, scrap, and queues; record WIP by area, because its location carries information.

  3. Cross-check shopfloor observations and TRS/OEE: include periods of material shortage and blocking; a highly utilized station may not limit throughput if it feeds a downstream area already saturated elsewhere.

  4. Map the value stream (VSM, Value Stream Mapping): connect physical flows and information flows, and highlight waiting times, which often dominate transformation times.

  5. Run a falsification test: verify that saturation is persistent, that micro-stops translate into shipped-throughput losses, and that changing the release rule alone does not drain the queue.

Starvation describes a station waiting for parts; blocking describes a station that can no longer discharge downstream. The real constraint rarely starves for any significant period. A frequently blocked station often signals a downstream problem, not a local constraint. This vocabulary clarifies mechanisms, and therefore decisions.
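A minimal sketch of how this vocabulary becomes a diagnostic: classify each observation of a station's state from a shopfloor log or a simulation trace, then compare the shares. The traces below are hypothetical, purely for illustration.

```python
from collections import Counter

def state_shares(trace):
    """Share of observations spent in each state ('running', 'starved', 'blocked')."""
    counts = Counter(trace)
    total = len(trace)
    return {state: counts[state] / total for state in counts}

# Hypothetical minute-by-minute observations of two stations:
trace_b = ["running"] * 52 + ["blocked"] * 6 + ["starved"] * 2   # candidate constraint
trace_c = ["running"] * 30 + ["starved"] * 28 + ["blocked"] * 2  # sits downstream

print(state_shares(trace_b))  # B almost never starves: consistent with a constraint
print(state_shares(trace_c))  # C starves nearly half the time: it is fed too slowly
```

Station C's starvation points back upstream; station B's near-zero starvation plus occasional blocking is the signature worth investigating further.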

 

V- Why simulation beats the human eye

The human eye sees stock; it does not see the dynamics that created it. A small variation in cycle time, repeated, triggers a threshold effect and turns a smooth flow into a congested system. Delay propagation is non-linear: one minute lost in the wrong place can cost an hour later. Simulation reproduces these phenomena without disrupting the plant.

A bottleneck does not always have a spindle and a control panel. It can sit in a lack of empty bins, an unavailable cart, an overly long travel distance, or an ill-conceived replenishment rule. A flow simulation integrates these elements: travel times, storage capacities, route frequencies. Teams sometimes discover the “limiting workstation” sits between workstations.

A digital twin reproduces the behavior of an industrial system with its rules, resources, and variability. It lets you compare scenarios before touching the plant: batch change, extra shift, new dispatching rule, or machine investment. Each scenario is judged on throughput, WIP, and throughput time. The shopfloor stays stable, decisions gain reliability — exactly the kind of trade-off Dillygence supports daily, with operational models grounded in reality.
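A real digital twin is far richer than this, but the scenario logic can be sketched on a toy flow model with hypothetical rates: each scenario changes one rule or capacity, and every scenario is judged on shipped throughput and residual WIP rather than on local utilization.

```python
def run_scenario(hours, release_rate, constraint_rate):
    """Toy flow model: parts released upstream each hour, shipped by the constraint."""
    wip = 0
    shipped = 0
    for _ in range(hours):
        wip += release_rate                 # upstream release rule
        done = min(wip, constraint_rate)    # constraint ships what it can
        wip -= done
        shipped += done
    return {"shipped": shipped, "wip": wip}

scenarios = {
    "baseline (push releases)": run_scenario(40, release_rate=60, constraint_rate=45),
    "paced releases":           run_scenario(40, release_rate=45, constraint_rate=45),
    "elevated constraint":      run_scenario(40, release_rate=60, constraint_rate=55),
}
for name, result in scenarios.items():
    print(name, result)
```

Pacing releases does not raise shipped throughput, but it eliminates the WIP; only elevating the constraint raises throughput. That is exactly the kind of trade-off a scenario comparison makes visible before any money is spent.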

 

VI- Treat and stabilize the constraint: flow synchronization

Once the bottleneck is proven, treatment follows TOC logic: exploit, subordinate, elevate, then re-check. Order matters, because it avoids unnecessary spending. The goal is to increase shipped throughput and reduce lead time, not to “keep all machines running”.

Exploit, subordinate, elevate: the first three levers

Exploiting the constraint means removing avoidable losses on the limiting resource: setups, maintenance, quality, in-station logistics. One minute recovered on the constraint turns into sellable throughput. Subordinating means aligning releases to the limiting resource's capacity, not to the desire to keep upstream stations busy. Flow becomes pulled by the constraint; WIP drops and throughput time stabilizes.

Elevating means increasing the constraint's effective capacity. The trade-off often pits OPEX (operating expenditure) against CAPEX: extra shift, shifted hours, subcontracting, automation, new equipment. The criterion is not “how much it costs”, it is “how much additional sellable throughput it creates”. Simulation validates scenarios before spending.
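The criterion can be made explicit as a ranking rule: additional sellable throughput per unit of spend, not cost alone. A minimal sketch with hypothetical options and figures (the units do not matter, only the ratios):

```python
def rank_options(options):
    """Sort capacity options by additional sellable throughput per unit of spend."""
    return sorted(options, key=lambda o: o["delta_throughput"] / o["cost"], reverse=True)

# Hypothetical options: cost in k-euros, delta_throughput in parts/week.
options = [
    {"name": "extra shift (OPEX)",  "cost": 120, "delta_throughput": 300},
    {"name": "new machine (CAPEX)", "cost": 900, "delta_throughput": 900},
    {"name": "setup reduction",     "cost": 40,  "delta_throughput": 150},
]
for option in rank_options(options):
    print(option["name"], round(option["delta_throughput"] / option["cost"], 2))
```

With these illustrative figures, the cheapest option also creates the most throughput per unit spent; the expensive machine comes last, despite the largest absolute gain. Simulation is what supplies credible `delta_throughput` values for each option.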

Drum-Buffer-Rope: protect throughput

Drum-Buffer-Rope puts a metronome on the constraint. The drum sets the pace, the buffer protects the constraint from variability, and the rope limits upstream releases. The buffer is not stock “for comfort”; it is a device that protects throughput. Flow then stops collapsing at the first micro-event.
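The rope can be written as a single rule: release what the drum will consume, plus whatever is needed to refill the buffer to its target, and nothing more. A minimal sketch with hypothetical figures:

```python
def rope_release(buffer_level, buffer_target, drum_rate):
    """Parts to authorize for release this period: drum consumption plus any
    buffer shortfall, never more; no release just to keep upstream busy."""
    return max(0, buffer_target - buffer_level + drum_rate)

print(rope_release(buffer_level=40, buffer_target=40, drum_rate=10))  # 10: steady state, match the drum
print(rope_release(buffer_level=25, buffer_target=40, drum_rate=10))  # 25: refill the shortfall, plus the drum's pace
print(rope_release(buffer_level=60, buffer_target=40, drum_rate=10))  # 0: buffer overfull, hold releases
```

The third case is the one intuition resists: upstream stations stand idle while the buffer drains back to target, precisely so that WIP does not rebuild.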

 

VII- Quantify impact and decide: from diagnosis to financial return

A bottleneck must be treated with economic logic; otherwise improvement becomes a hobby. Shipped throughput drives revenue, lead time (throughput time) drives the customer promise, and WIP drives trapped cash. EBITDA is fed by these three levers, often more than by chasing unit costs.

TOC literature reports significant throughput gains when the real constraint is found and protected, sometimes around 10% to 30% on certain scopes, depending on cases and variability level. A serious decision requires internal quantification via data and simulation. The decision tree must compare automation, reorganization, leveling, buffering, and hiring on throughput and lead time, before going to the shopfloor.

Two mini-cases

Case 1

What: a machining line shows high WIP and unstable customer lead times.

How: simulation reveals a constraint that varies by reference; tasks are then rebalanced and setups reduced on the truly limiting resource.

Impact: shipped throughput increases and WIP decreases, with no added equipment. This illustrates a possible throughput increase without CAPEX when the constraint is found and protected.

Case 2

What: an assembly workshop releases orders early “just in case”, then drowns in WIP.

How: the team revises release rules using Drum-Buffer-Rope and places a calibrated buffer before the proven constraint.

Impact: throughput time drops, and firefighting decreases because priority becomes legible. Typical order of magnitude observed in workshops: -20% to -40% on lead time (throughput time) on multi-SKU assembly lines, after stabilizing releases and calibrating the buffer, with no CAPEX (capital expenditure). This illustrates stability gains without heavy investment.


 

VIII- Traps to avoid and countermeasures

  • Optimize outside the constraint. Lots of activity, little throughput.
    Countermeasure: no meaningful improvement effort without a quantified link to shipped throughput.

  • Overproduce to “stay busy”. WIP traps cash, lengthens queues, and increases quality risk.
    Countermeasure: limit releases based on constraint capacity and the buffer.

  • Manage by local KPIs. TRS/OEE green, customer red.
    Countermeasure: system-oriented dashboard — shipped throughput, throughput time, WIP, and plan stability.

  • Move the constraint without stabilizing it. The migrating bottleneck becomes chronic.
    Countermeasure: re-check after each action, adjust control rules, identify and protect the new limiting point.

 

In summary

  1. A bottleneck is not the machine that breaks down most often; it is proven by overall throughput.

  2. WIP is a symptom, not a verdict.

  3. The constraint moves, so a snapshot is not enough — you need a dynamic model.

  4. Simulation and the digital twin turn capacity trade-offs into quantified decisions.

Dillygence gives industrial companies the ability to locate the real constraint and test treatment scenarios through a digital twin. Discover the Operation Optimizer.

 

FAQ: bottlenecks in industry

Why does a bottleneck limit a system's throughput?

Overall throughput cannot exceed the effective capacity of the most limited resource along the shipping path. When that station saturates, it creates a queue and caps output. Optimizing elsewhere mainly increases WIP. Improving the constraint directly increases sellable output.

What is a “goulet” or “goulot d'étranglement”?

A “goulet” or “goulot d'étranglement” refers to the constraint that limits the passage of a flow, therefore the overall throughput. In production, it can be a machine, a manual operation, an inspection step, or a logistics resource. The term describes a system phenomenon, not just an isolated station. It is proven by its impact on shipping.

What is a bottleneck?

A bottleneck is the point that caps a system's output capacity over a given period. It is characterized by persistent saturation and high sensitivity to micro-stops. It can be fixed or nomadic depending on product mix. Treating it should follow TOC logic, then re-checking.

What is the bottleneck effect?

The bottleneck effect appears when a local limitation creates a queue, inflates WIP, then lengthens throughput time. The system becomes unstable as utilization approaches 100%, because variability turns into waiting. The real cause lies in the constraint's effective capacity and how it is protected.
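This non-linearity can be illustrated with the classic M/M/1 queue (random arrivals and service times at a single station), where the mean wait in queue is ρ/(μ − λ) and explodes as utilization approaches 100%. A minimal sketch with a hypothetical 60 parts/hour station:

```python
def mm1_wait(lam, mu):
    """Mean time in queue (hours) for an M/M/1 station:
    arrival rate lam, service rate mu, utilization rho = lam/mu."""
    rho = lam / mu
    assert rho < 1, "unstable: arrivals exceed capacity"
    return rho / (mu - lam)

mu = 60  # station capacity: 60 parts/hour
for lam in (30, 48, 54, 57, 59):
    print(f"utilization {lam/mu:.0%}: mean queue wait {mm1_wait(lam, mu)*60:.1f} min")
```

Going from 50% to 98% utilization multiplies the wait by roughly sixty, with no breakdown and no bad actor: variability alone does it. This is why "keeping every station busy" is a direct recipe for long lead times.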

What is a synonym for “bottlenecks”?

Common synonyms include “constraint”, “limiting point”, “saturated resource”, and “congestion point”. In TOC, “constraint” is the most precise term, because it links directly to overall throughput. “Congestion point” describes a symptom rather than a cause. The best term depends on what you measure: throughput or stock.

What is the difference between “goulot” and “goulet d'étranglement”?

There is no fundamental difference: “goulot” and “goulet” refer to the same concept. The difference is usage and editorial preference. In industry, the issue is not spelling but proving which point caps throughput. A measurement-and-simulation method settles the debate faster than spelling.