Complex Systems Won’t Survive the Competence Crisis

At a casual glance, the recent cascades of American disasters might seem unrelated.

In a span of fewer than six months in 2017, three U.S. Navy warships experienced three separate collisions resulting in 17 deaths. A year later, powerlines owned by PG&E started a wildfire that killed 85 people. The pipeline carrying almost half of the East Coast’s gasoline shut down due to a ransomware attack. Almost half a million intermodal containers sat on cargo ships unable to dock at Los Angeles ports. A train carrying thousands of tons of hazardous and flammable chemicals derailed near East Palestine, Ohio. Air Traffic Control cleared a FedEx plane to land on a runway occupied by a Southwest plane preparing to take off. Eye drops contaminated with antibiotic-resistant bacteria killed four and blinded fourteen.

While disasters like these are often front-page news, the broader connection between them barely elicits any mention. America must be understood as a system of interwoven systems; the healthcare system sends a bill to a patient using the postal system, and that patient uses the mobile phone system to pay the bill with a credit card issued by the banking system. All these systems must be assumed to work for anyone to make even simple decisions. But the failure of one system has cascading consequences for all of the adjacent systems. As a consequence of escalating rates of failure, America’s complex systems are slowly collapsing.

The core issue is that changing political mores have established the systematic promotion of the unqualified and the sidelining of the competent. This has continually weakened our society’s ability to manage modern systems. At its inception, this shift represented a break from the trend of the 1920s to the 1960s, when the direct meritocratic evaluation of competence became the norm across vast swaths of American society.

In the first decades of the twentieth century, the idea that individuals should be systematically evaluated and selected based on their ability rather than wealth, class, or political connections led to significant changes in selection techniques at all levels of American society. The Scholastic Aptitude Test (SAT) revolutionized college admissions by allowing elite universities to find and recruit talented students from beyond the boarding schools of New England. Following the adoption of the SAT, aptitude tests such as the Wonderlic (1936), the Graduate Record Examination (1936), the Army General Classification Test (1941), and the Law School Admission Test (1948) swept the United States. Spurred on by the demands of two world wars, this system of institutional management electrified the Tennessee Valley, created the first atom bomb, invented the transistor, and put a man on the moon.

By the 1960s, the systematic selection for competence came into direct conflict with the political imperatives of the civil rights movement. During the period from 1961 to 1972, a series of Supreme Court rulings, executive orders, and laws—most critically, the Civil Rights Act of 1964—put meritocracy and the new political imperative of protected-group diversity on a collision course. Administrative law judges have accepted statistically observable disparities in outcomes between groups as prima facie evidence of illegal discrimination. The result has been clear: any time meritocracy and diversity come into direct conflict, diversity must take priority. 

The resulting norms have steadily eroded institutional competency, causing America’s complex systems to fail with increasing regularity. In the language of a systems theorist, the decreasing competency of the actors within the system means that formerly stable systems have begun to experience normal accidents at a rate faster than the system can adapt. The prognosis is harsh but clear: either selection for competence will return or America will experience devolution to more primitive forms of civilization and loss of geopolitical power.

From Meritocracy to Diversity

The first domino to fall as Civil Rights-era policies took effect was the quantitative evaluation of competency by employers using straightforward cognitive batteries. While some tests are still legally used in hiring today, several high-profile enforcement actions against employers caused a wholesale change in the tools they could customarily use to screen for ability.

After the early 1970s, employers responded by shifting from directly testing for ability to using the next best thing: a degree from a highly selective university. By pushing the selection challenge to college admissions offices, selective employers did two things: they reduced their risk of lawsuits and they turned the U.S. college application process into a high-stakes war of all against all. Admission to Harvard would be a golden ticket to join the professional managerial class, while mere admission to a state school could mean a struggle to remain in the middle class.

This outsourcing did not stave off the ideological change for long.

Read the rest HERE.

