Nearly Right

The education of corporate silence: how experienced engineers learn to watch projects fail

Senior professionals develop an instinct for strategic inaction. The institutional costs are higher than anyone admits.

A senior Google engineer recently explained to a junior colleague why he thought a sister team's project was doomed. Poor early design choice. Failure visible months out. The junior asked the obvious question: why not tell them?

It was the same question the senior engineer had asked his own manager years earlier, watching him dismiss another team's ambitious project with a shrug and a prediction of failure—then do nothing. To his younger self, the waste had seemed unconscionable. If you can see the iceberg, why not shout?

The answer he gave his mentee was the answer he had spent years learning: being right and being effective are different things. Arguing with people who will not listen costs more than it saves.

This is not a confession of cowardice. It is a description of how experienced professionals actually operate—and the institutional costs of that adaptation are higher than most people want to acknowledge.

The economics of influence

Lalit Maganti, the engineer in question, describes organisational influence as a bank account. You earn small deposits by doing your job well, helping colleagues, shipping successful projects, staying low-friction. Every time you raise a concern, you make a withdrawal.

The withdrawals are not equal. A code review nitpick costs five dollars. Challenging an architectural decision costs five hundred. Trying to kill a vice president's pet project? Fifty thousand dollars. You might be able to afford one of those in a career.

Spend five dollars on every inefficiency you notice, and you will be bankrupt when you need to write the big cheque. Go overdrawn and you enter political insolvency: people stop inviting you to meetings, stop asking your opinion, start working around you. Your influence drops to zero.

This is why experienced engineers let projects fail. Not from apathy. From arithmetic.
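Treated as arithmetic, the metaphor is easy to make concrete. The short Python sketch below is a loose illustration, not anything Maganti proposes; the dollar figures are the illustrative ones from his metaphor, and the starting balance is invented.

# A minimal sketch that takes the bank-account metaphor literally.
# All figures are illustrative, borrowed from the metaphor above.

WITHDRAWALS = {
    "code review nitpick": 5,
    "challenge an architectural decision": 500,
    "oppose a VP's pet project": 50_000,
}

def balance_after(balance: int, concerns_raised: list[str]) -> int:
    # Subtract the cost of each concern raised from the current balance.
    for concern in concerns_raised:
        balance -= WITHDRAWALS[concern]
    return balance

# Hypothetical: a year of solid delivery earns 10,000 in capital.
capital = 10_000
capital = balance_after(capital, ["code review nitpick"] * 200)                  # 9,000 left
capital = balance_after(capital, ["challenge an architectural decision"] * 20)   # -1,000: overdrawn
print(capital)

The point of the toy is only that small, frequent withdrawals compound: by the time the fifty-thousand-dollar concern arrives, the account is already empty.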

The paradox of institutional self-knowledge

The irony of hearing this from a Google engineer is sharp. In the early 2010s, Google studied what made teams succeed. Project Aristotle analysed 250 attributes across 180 teams, expecting intelligence or experience to predict performance.

Instead, they found psychological safety—the sense that people could raise concerns without fear of embarrassment—was the single strongest predictor of team effectiveness. It accounted for 43 per cent of variance in performance. Teams with high psychological safety were more productive, more innovative, and had dramatically lower turnover.

Amy Edmondson, the Harvard professor whose research Google built upon, has documented this for two decades. Her hospital studies found that teams reporting more medication errors were not making more mistakes. They felt safe enough to discuss them, which let them learn. When people are afraid to speak up, preventable mistakes go unchecked.

Google knows this. The company has published extensive guidance on fostering psychological safety. Yet the same organisation produces engineers who have spent years learning that strategic silence is the rational choice.

This is not hypocrisy. It is something stranger: organisations can know what makes them effective while systematically incentivising the opposite. The gap between institutional knowledge and institutional practice is where careers learn to be strategic.

The Cassandra coefficient

In Greek mythology, Cassandra was cursed to see the future but never be believed. Richard Clarke and R.P. Eddy, former White House national security officials, borrowed her name for research into experts who correctly predict disasters and are systematically ignored. They found Cassandras who warned of Katrina, Fukushima, the 2008 financial crisis—all right, all dismissed.

The problem was not that these experts lacked data. It was that accurate warnings look identical to unfounded pessimism until the disaster arrives. Nobody gets credit for the catastrophe that was prevented. Warn about something that never occurs—because you warned about it—and you look like you cried wolf.

This creates the Cassandra Tax: a penalty on accurate but unwelcome information, paid in political capital. Pay it too often and you bankrupt your career.

Evolutionary biology offers an unexpected lens on this dynamic. Costly signalling theory examines when organisms choose to send signals—and when they suppress them. The traditional view held that signals must be costly to be honest: the peacock's absurd tail proves fitness because only a healthy bird can afford to carry it.

Recent research complicates this. What matters is not the cost of signalling accurately, but the cost of signalling at all. An organism will suppress an honest signal if sending it costs more than the expected benefit. The peacock displays when a mate is watching. He folds his tail and hides when a predator is near.

Senior engineers have learned the same calculation. The cost of raising a concern is not the effort of speaking. It is the reputational damage of being seen as negative, the political capital spent on a project you do not own, the relationships strained with people who did not want to hear it. These costs are paid whether or not you turn out to be right.

Being right is not enough. Being right and being believed, at acceptable cost, is what matters. The two rarely coincide.
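The same logic can be written as a back-of-the-envelope expected-value check. The sketch below only restates the reasoning of this article in code; every probability and number in it is made up for illustration.

# The costs of speaking up are paid regardless of whether you turn out to be
# right; the benefit only arrives if you are right AND believed.
# All values below are illustrative, not measured.

def expected_value_of_speaking_up(
    p_right: float,          # chance the warning is actually correct
    p_believed: float,       # chance anyone acts on it, given that it is correct
    value_if_heeded: float,  # what the organisation saves if the warning lands
    political_cost: float,   # capital spent whether or not you are right
) -> float:
    return p_right * p_believed * value_if_heeded - political_cost

print(expected_value_of_speaking_up(0.8, 0.1, 5_000, 500))  # -100.0: stay silent
print(expected_value_of_speaking_up(0.8, 0.5, 5_000, 500))  # 1500.0: worth the withdrawal

A warning that is probably correct still loses money when nobody is likely to listen. That is the Cassandra Tax in a single line.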

The slow accumulation of uncorrected mistakes

Frances Milliken and Elizabeth Morrison, who have studied organisational silence for decades, found that 85 per cent of employees have felt unable to raise a concern they believed important. The most common reasons: fear of being viewed negatively (30 per cent), fear of damaging relationships (27.5 per cent), and belief that raising the issue would not lead to change anyway (25 per cent).

That last category—belief in futility—is the most troubling. It represents learned helplessness: the state that develops after repeated failed attempts to influence outcomes. Psychologist Martin Seligman discovered the phenomenon in animal experiments, but it applies equally to humans who have learned through experience that their efforts do not matter.

The transformation from idealistic junior to strategic senior follows a pattern. Edmondson's recent research found that new employees often arrive with high psychological safety—confidence that their employers want to hear from them. Over time, that confidence erodes. They learn which topics are discussable. They learn whose projects can be questioned. They learn that some concerns, however valid, are not worth the career risk.

What makes this especially troubling is what it does to organisations over time. When experienced people stop speaking up, institutions lose access to accumulated wisdom. The engineer who has seen three similar projects fail recognises the pattern in the fourth—but has learned that saying so costs more than staying quiet.

Maganti recounted a vivid example from Google. A high-profile project was announced at the intersection of two enormous organisations, positioned as a game-changer. Technically elegant. Full of clever solutions to hard problems.

Maganti and his lead recognised the fatal flaw immediately: the project required a flagship product team to cede control of their core user flow to a platform team. No product team lead would ever surrender something that central. Politically, the project was fantasy.

They said nothing.

The project chugged along for nearly two years, missing launch after launch. Eventually the strategic pivot email arrived. Resources reallocated. Code deleted. Everyone was told the company had learned a lot.

Two years of engineering hours, spent on a project that people with experience had identified as doomed from the announcement. But those people had learned that the cost of speaking up exceeded the benefit.

What cannot be fixed from below

There is a counterargument, and it deserves engagement. If psychological safety research is correct—and the evidence is substantial—organisations should invest heavily in creating environments where speaking up is rewarded. Leaders should model vulnerability, invite dissent, respond constructively to unwelcome news.

The problem is that this describes cultural transformation, not individual strategy. An engineer who speaks up more, trusting the organisation will eventually reward candour, takes personal risk for collective benefit. If wrong about the culture—if the organisation is not actually ready to hear concerns—they pay the price alone.

Ninety per cent of employees in one survey reported feeling unsafe to speak their mind at least once in the previous eighteen months. The organisation's median behaviour may be psychologically dangerous even if leadership believes otherwise.

Individual professionals cannot wait for culture change. They have careers to navigate, mortgages to pay, limited stores of political capital. Maganti is not arguing against psychological safety as a goal. He is describing survival strategy in organisations as they actually exist.

The view from experience

The honest conclusion is uncomfortable. The gap between what research says organisations should do and what career incentives actually reward may be unbridgeable at the individual level. The best experienced professionals can do is choose moments wisely—spend influence where it matters most, build quiet contingency plans, accept that some disasters will unfold exactly as predicted while they watch.

Maganti ended his account with a phrase that captures something essential about this adaptation: the grim satisfaction of predicting exactly how things would fall apart. Not as satisfying as fixing everything, he acknowledged. But it works. It keeps him sane.

There is something almost tragic in that formulation. A generation of experienced professionals, possessed of hard-won wisdom about what works and what does not, have concluded that sharing it is usually not worth the cost. They have learned to let projects fail—not because they do not care, but because caring too publicly is a risk they cannot afford.

The organisations they work for have, in effect, disabled their own early warning systems. Not through malice, but through incentive structures that reward conformity and penalise dissent. Research says this makes organisations less effective. Experience says it makes individuals more employable.

It is a tension that no psychological safety training will fully resolve, because it is rooted in something deeper than culture: the fundamental misalignment between what organisations need from their people and what they actually reward. Until that changes, the education of corporate silence will continue—one disillusioned junior at a time, learning from a senior who once asked the same questions they are asking now.

#leadership #software development