A course on handgun safety. Thanks Deputy.
(This opinion was a comment to a statement by the ZMan that people will choose safety over freedom – photog)
Not everyone chooses safety over freedom. People defying mask and social distancing orders are choosing freedom over what the elite says is safety (masks are little more than a placebo). Even more importantly, brave humans still man the wall in our military and police and firefighting forces.
From my own tribe come the Chief Mountain Hotshots, a Blackfoot force of para-jumping firefighters who fight wildfires in forests. They owe no fealty to the federal or state government, and the overwhelming majority of the fires they fight are not on the reservation.
The police are still working, even in areas where sissy politicians are trying to defund them. They are still responding to assist citizens, even in areas where protesters have supposedly forbidden them to go.
There are things going on in our military that seldom if ever see the light of day. We have exceptionally brave and skilled people doing almost impossible things to protect us every day. Heck, just working on the most dangerous four-and-a-half acres in the world, an aircraft carrier flight deck, is hazardous even in peacetime. We lose more GIs to training accidents than to enemy action these days. They chose to protect freedom for others, almost all of whom they will never meet and in many cases would not like, over their own personal safety.
When you put your own body between home and the war’s desolation, you prove your fitness to be a citizen. Soldiers do not fight because they hate what is in front of them, but because they love what is behind them. To place your own frail body between home and danger is the ultimate test of citizenship. No sunshine patriots, these GIs and cops and firefighters. They are patriots to the bone.
I do not believe most conservatives fantasize over a civil war. It’s more of a nightmare. It will be pretty damned bloody, and Americans killing Americans is not what any conservative wishes, even if we are willing to do it when push comes to shove. When aroused, Americans are insensate fighters. To awaken the berserker spirit within us is not a good idea. Ask the Japanese of the Pacific theater in WWII. There is a reason our civil war was our bloodiest in history. With well over 20 million veterans among the citizenry, there are enough trained people to make even a foreign enemy think twice, let alone a domestic enemy. That is why a civil war in the US is any true patriot’s worst nightmare.
To put the lie to all this talk of inter-generational warfare, my friend Tyler at The Portly Politico and I have decided to reach across the generational abyss and sponsor an exchange of posts. He has posted an essay here at OCF and I have posted one of my reviews at his site (see link below).
Seriously, we thought it would be fun for his readers and mine to get some slightly different material for a change. I think injecting some other points of view into the site is a big plus.
[Update] – I saw what a nice intro Tyler gave to my post over at his site so I decided I should try to follow suit. As part of a cross-posting agreement, Tyler from over at the Portly Politico has kindly agreed to talk about what it’s like for the millennial generation to try to follow in the footsteps of their parents’ lifestyle. I think it will be valuable for Boomers, Xers and Millennials alike. Highly recommended.
By Tyler James Cook, The Portly Politico (https://www.theportlypolitico.com)
When photog proposed swapping blog posts in the comment section of The Fat Man’s “Cityscape at Night,” I was intrigued, and quite enthusiastic. That was before I succumbed to a gnarly head cold and worked a thirteen-hour day. But that sickly plight leads nicely into photog’s suggested topic: what are the major concerns of a young American today?
At thirty-five, I don’t know how “young” I am, but it’s one of those ages where older people tut-tut when you suggest you’re aging. I suppose their advanced years have taught them otherwise, and that they’d much rather be a slightly creaky thirty-five than a croaky eighty-five.
Surprisingly, I am considered part of that great, reviled generation, the Millennials. I certainly don’t feel like one, what with my love of tradition, Christianity, and President Trump. I was born in a time when Internet usage was limited to college campuses and obscure Bulletin Board Systems, when we weren’t handed a Star Trek communicator with access to all the world’s knowledge—and its basest, filthiest indulgences—when we were five.
But we had Nintendo and cable TV, and all manner of luxuries and gadgets our parents could only dream of (although my parents apparently played Pong while dating). Suburbia was kind to my generation—too kind, as we grew up spoiled and allergic to hard work.
That said, not all Millennial whining is unjustified. Our parents—the latter Boomers and the early Gen-Xers—could support a family of four or five on blue-collar salaries. They also didn’t pay a fortune for college, and their college education taught them something useful, rather than Derridaean deconstruction of everything good and decent. That degree was also their ticket to the middle class.
We grew up being assured that if we followed the same path, we’d end up with similar outcomes; indeed, we’d be better off than our parents. For many Millennials, that was true: both of my brothers, for example, make very good livings in academia and the law. Access to the credentialed classes was greater than it had ever been in American history for my generation.
But one of the problems is that we could no longer sustain a family on a working man’s salary. Indeed, the girls we grew up with claimed they didn’t want that. They wanted careers and academic accomplishments; the highest accolades of their chosen fields. Never mind that most of them finished out college with a useless B.A. in Psychology (the go-to degree for girls who don’t know what they want to study) and loads of debt; that just began their long 20s, that period in which they could explore and “find themselves.” Or they got married straight out of college after all.
The problem is that with excessive credentialing, degrees have become increasingly worthless. For example, I hold a B.A. and M.A. in History. That M.A. paid off in that it gained me a small initial boost in my teaching salary, and it made it possible for me to adjunct at a local technical college (never mind that I’m teaching the same material—often at a slower pace—to the college classes as to the high school students; the State wants to see that M.A.). Otherwise, it’s been largely an ornament, something my school can tout in its statistics about faculty qualifications.
I’ve managed to carve out a decent living for myself in rural South Carolina, but it’s required constant hustling and budgeting. To sustain myself (and sock away money for retirement), I work full-time at the high school; adjunct one or two classes online each semester; teach multiple private music lessons after school; organize and book my own shows to bring in revenue (mainly through merch sales); teach summer classes and camps; and, until this summer, work maintenance at school. For all of that effort, I scrape together around $50,000 to $55,000 a year (although I came close to $60,000 one year).
Self-employment taxes eat away at a good chunk of my private lessons business, which The Virus temporarily shattered (along with live gigs). I do fine for myself—I managed to buy a used car with earnings from music lessons in 2019—but if I had a stay-at-home wife and kids, there would be no way we could make it work.
For one, my health insurance would be outrageous if I didn’t game the Affordable Care Act. In order to avoid paying $400 a month in premiums for a plan with a $6,750 deductible (you read that right), I max out my 403(b), traditional IRA, and HSA contributions, which get deducted, for the purpose of ACA subsidies, from my gross income. That modified adjusted gross income, or MAGI, is low enough that the ACA considers me sufficiently destitute to pay out subsidies, so my $400-a-month premium drops to around $1 a month.
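The arithmetic behind the maneuver can be sketched in a few lines. This is a minimal illustration only, not a tax calculator: the contribution amounts below are hypothetical round numbers, and the real ACA subsidy depends on household size and the federal poverty level, which this sketch ignores.

```python
# Minimal sketch of the MAGI arithmetic described above.
# All dollar figures are hypothetical round numbers, not
# actual IRS contribution limits or ACA thresholds.

def magi(gross_income, pretax_contributions):
    """Modified adjusted gross income: gross pay minus pre-tax deductions."""
    return gross_income - sum(pretax_contributions.values())

gross = 55_000  # roughly the essay's reported annual earnings
contributions = {
    "403(b)": 19_000,          # hypothetical elective deferral
    "traditional IRA": 6_000,  # hypothetical
    "HSA": 3_500,              # hypothetical
}

income_for_subsidy = magi(gross, contributions)
print(income_for_subsidy)  # 26500 under these assumed figures
```

The point is visible in the numbers: the exchange evaluates subsidy eligibility against the $26,500 MAGI rather than the $55,000 gross, which is what turns a $400 monthly premium into roughly $1.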
Again, for a single man at thirty-five, it’s not a bad deal. I’m in relatively good health (and am dropping some extra fat) and have managed to squirrel away enough in my emergency fund to reach my deductible without touching my HSA contributions (I’m treating my HSA as an investment vehicle, with my contributions invested in various mutual funds). But if I were married with kids, it would be a whole different story.
I’m also blessed to have made it through college and graduate school debt free, and to have never had a car payment. That is a luxury—really, the result of extremely generous gifts from my parents and grandparents—that has enabled me to pursue a life of financial asceticism. If I had student loans and car payments, like many of my peers, it would be far more difficult to save and invest.
As it is, I feel like I work constantly just to provide a good life for Future Portly. The cost in the here and now, though, is palpable. Not only have I sacrificed energy, I’ve sacrificed some of the enjoyment of life. Those are necessary sacrifices to avoid becoming a ward of the State in my dotage, but the price seems very high—and one that it seems I must now bear alone.
To be clear, I don’t mean to complain. I am blessed to live a good life, and to own a house, free-and-clear. I enjoy a degree of financial autonomy that strikes awe in my peers.
But I don’t know if it’s sustainable with a family—what I want more than anything. The debased nature of modern dating—the topic for another guest post, perhaps?—puts a man with a traditional worldview and sound financial sense in a precarious situation. Having built my legacy, I don’t want to squander it on some Tinder harridan with a butterfly tattoo and blue hair. But the inflated nature of the modern dating marketplace makes even the greasiest of girls believe they’re beauty queens with nothing but redeeming qualities.
(Editor’s Note: This essay addresses my diatribe against giant space amoebas.)
Gently, gently. Remember when those shows aired. It was the late 60s. We were shooting for the moon (and made it). Nothing seemed impossible to us.

Back then a transistor radio the size of the palm of my hand was the latest and greatest in portable music. 8-track tapes were all the rage for your car. Technology by today’s standards was Neolithic. Cars were not fuel efficient; their cubic inches were only surpassed by their horsepower ratings. Personal/corporate jets were becoming more popular. Most television programming was still in B&W, although prime time went all color in 1966. Color TVs did not outsell B&W televisions until about 1970. CB radios were just becoming the thing, still mostly trucker toys until the mid-seventies. Cars did not have cellular phones; they had radiophones, and those were so expensive only the wealthy had them. Seat belts? Just beginning to attract notice. GPS? A dream someone at DARPA had. Night vision? First-generation Starlight Scopes. Lasers? Sci-fi. Bell Labs was doing some laser research, but a laser as a weapon, or a range finder, or a thermometer, or a pointer? That was genuine Star Trek stuff.

Lost in Space had its first season in B&W. A self-aware robot? Laser pistols? FTL travel in a flying saucer? Golden aliens? Remember the sci-fi movies of the 50s and 60s: Godzilla, a mutant lizard so huge he would collapse under his own weight in the real world. Rodan, another impossibly huge reptile who incidentally could exceed the speed of sound without flapping his wings. The Giant Claw, an antimatter bird from another galaxy, here to lay eggs and destroy our world. Robot Monster, a movie whose villain is a man in a gorilla suit with a space helmet for a head. War of the Worlds with Gene Barry: a lot of eye candy in there, but science? Nah. When Worlds Collide, where we as a species land on an interloper planet when our world is destroyed, and it just so happens to have a breathable atmosphere and earth-tolerant temperatures.
All entertainment and even world events led us to suspend disbelief. For the USA, nothing was impossible. Colonize Luna (the Moon), then Mars. Mine the asteroid belt. Surely FTL ships would be along before our grandchildren passed away. It was a time of unbounded enthusiasm. Science fiction was mostly fiction and was pure escapism, entertainment.
Yet, many things have come true. Kirk’s communicator was our flip phone. His tricorder our Tablet PC. We have remote monitoring of body functions as McCoy had in sick bay. Our machines talk to us and are voice-programmable. Ever talk to Siri or Alexa? We can compress enough data on a postage-stamp-sized SD card (like Spock’s data discs) that would have taken three buildings filled with machinery and magnetic tape storage in the 60s. We can stream live events and movies in excellent resolution and stereo sound to our hand-held smart phones. We have access to most of the world’s information at our fingertips. We can shoot down planes and missiles with lasers.
Roddenberry didn’t dream big enough. My maternal grandfather was born in the nineteenth century. He told me how things were when he was a child. He was literally born in the horse-and-buggy era. He remembered the big hooraw over the Wright Brothers’ first flight, and he lived to see men walk on the moon. I in turn tell my grandson what it was like when I was a child. I also add in the parts about walking to the school bus drawn by a team of muskox in minus-40-degree weather through snow three feet deep, fending off dire wolves who were trying to get my school lunch made from mammoth tenderloins. Just for fun. But remember back in your childhood, photog, and compare it to today. You and I both listened to 45 RPM records. Technology is advancing faster all the time. My first airliner ride was in a Super Constellation, a prop-driven airliner. My first helicopter ride was in a Sikorsky H-34, the type Fernando Lamas piloted in The Lost World, with Michael Rennie.
We’ve come a long way pretty fast. Back then it took less suspension of disbelief than it does now.
Some case studies of innovation begin with a scientific advance, such as the identification of the photoelectric effect or some other quantum phenomenon, and trace its application to an invention dependent on that advance, such as the laser. Other descriptions are more ethnographic, observing an industrial ecosystem and then focusing in on one of its niches, like the Connecticut River Valley manufacturing industry of the 18th century and its development of interchangeable gun parts. More quantitative accounts begin with economic dynamics, measuring the roles of capital and labor, and then try to show excess growth attributable to changes in technology processes or investment.
All of these approaches seek to account for growth not related to easily measurable factors by looking at newly discovered insights or newly introduced technologies that confer some advantage on an offering competing in a market. Many of these accounts are useful in documenting the precedent conditions to productive change. They have been reduced to a list in many papers and articles on innovation and economic growth. They include access to basic research and related intellectual property, capital, talent, geographic or virtual proximity, and so on. Other less concrete factors are also named, such as entrepreneurism, leadership, or vision. This body of literature is rapidly growing, but the more that is written about innovation, and the greater the attempts to reduce it to an economic model, the further the goal seems to move. The sudden drop in total factor productivity in the US after the 1970s seems less understood the more that is written about it. Commentators, whether economists or philosophers, business leaders or politicians, have moved from qualitative analysis to social pleading, yet offer no reliable, let alone predictive, hypothesis.
To some, the loss of American vitality is seen as an emergency, a surrendering or dissipation of the most valuable trend in human history: the loss of a cultural and economic heritage that transformed the world from a brutal place to a prosperous one. To others, the change was the inevitable correction as resources were redistributed by political systems evolving away from their imperial structures of exploitation. Why do some students and proponents of innovation see it as somehow related to culture? Why do discussions of innovation seem to invite political explanations? At any level of analysis, it would seem innovation has almost nothing to do with politics and philosophy, but is rather a question of science, economics, and commerce. It is true that politics influences and at times determines investment in science and seeks to manage economies, if not specific markets, but does that mean we can find the source of innovation in political processes?
The issue of what changed that precipitated the reduction in growth of the US economy and, apparently, innovation has a stock list of suspects. Government regulation is a commonly cited culprit. In the case of nuclear energy this seems irrefutable. Corporatism is another clear candidate. Anyone who carefully analyzes big company structures and processes, from their silo functions to their anti-competitive strategies and general slow-footedness knows that the landscape of a shrinking number of large companies dominating legacy industries can only be poison to innovation. It is hard to consider these and other familiar hypotheses that purport to account for the decrease in innovation, such as failed schools, family breakdown and the loss of faith, without turning away from the question in despair, even horror.
Perhaps it is better to start with a more direct examination of innovation in the past versus today: for example, the slowing of progress in individual transportation over the last fifty years. Why don’t cars fly? It is harder to make a car fly than roll, so innovation today won’t look like innovation a century ago. This is the low-hanging-fruit explanation: flying is harder. But what does that mean? Well, making a car fly is not an incremental change from progressively making cars roll faster and more efficiently. In fact, making a car fly may not be an innovation at all. Innovation is not the invention of new things for their own sake. Innovation solves replication problems. What replication problem does a flying car solve? How much faster does individual transportation need to move over the earth’s surface than a mile a minute? And, for that matter, how much faster than a mile a second does flight need to go? The low-hanging-fruit explanation does seem to touch on something useful, but not in the ordinary sense of the barrier of increasing complexity. It also points to the question of need.
Commentators point to aging American cities with their 19th century subways and mid-20th century skyscrapers as evidence of our decline. (We might observe, as an aside, that no one ever complains about the age of the buildings in Rome or Paris.) They point to slower travel times, increasing real energy costs, and shortening life expectancies in the same breath to demonstrate the drop in the pace of innovation. These seem alarming symptoms of our loss of progress. But are they really? How high does a building need to rise? How often should buildings be replaced? How many millions should a city accommodate? Subways certainly age and need to be maintained and improved, but should a civilization’s innovative energies be focused on subways? Surely this is not a problem of complexity, nor was the decision to abandon supersonic transport. These are choices that have little to do with innovation as normally discussed.
It is clear that in the postwar period, in different forms in Europe, the Americas, Asia, and the rest of the developed world, much of these societies’ productive energy was focused on “social progress”. Some would call much of it (the changing role of women, concern for the environment, other post-imperial transitions like industrial nationalization, the rise of the welfare state) social engineering that at least in name might be considered innovation. These large reallocations of resources and dislocations of existing social structures undoubtedly had equally large effects on the focus of our productive energies, if they did not derail them altogether. For much of the industrial world, social progress represented a deliberate regression away from the culture of Manhattan Projects and moon shots. Social progress led not to building more advanced cities but to housing projects for the poor, which, in turn, led many to leave cities altogether. In America, the suburban “innovation”, born of the federal interstate highway program, made things cheaper and more convenient at first, and increased standards of living substantially for at least two decades. But it did not just increase the marginal quality of upper-middle-class family existence; it eventually sent most women into the workforce and expanded the average size of the suburban house and the number of cars in its driveway. Living standards per capita, measured in occupied square feet, miles driven, and cost per student, ballooned in the 1970s and 1980s until even lower-middle-class families living outside of cities occupied larger houses, drove further, spent more per student on education, and even consumed more calories than their counterparts in any other society. Was this not productive change?
Many would say no. Those social and economic changes may have been desired after the two wars and the prospect of global extinction, but they did not yield what innovation always does: doing more with less. Rather the opposite. Reallocation and baby booms might be products of innovation, but they do not bring it about. But the social and material changes in family structure and standards of living do suggest an answer to our question of why building and subway construction have not advanced. They didn’t need to, certainly not with the suburbanization of society and the massive expansion of car culture.
There are parallels to this redirection of innovation in energy, in air transportation, even in medicine. A central concept in the development of new medical therapies is the idea of “unmet need”. As late as the dawn of the 20th century, a great many people in the world died of gastric perforation. This mortality was directly tied to waterborne infections and contaminants, so the unmet medical need for gastric disease was very high in the year 1900. Epidemiology showed that not just mortality but morbidity (suffering short of death, such as poor nourishment, pain, and loss of work) was also caused by digestive disease. At first slowly, through the improvement of urban waste management and water treatment, and then more quickly after World War II, through the development of a series of pharmacotherapies such as antibiotics, then H2 antagonists, PPIs, and finally triple antibiotic therapies, the medical unmet need for upper gastric disorders has largely been addressed.
This does not mean that no one suffers an upset stomach anymore. Prosperity and an overabundance of calories ensure that people still need digestive therapies. But as a public health priority, upper digestive disease has fallen from top to bottom. This is reflected in the demand both for infrastructure professionals and for new pharmacotherapies that address digestive disease. Public engineering in the first half of the 20th century in America was a leading professional undertaking as the nation built its cities to postwar capacity. Those same H2 antagonists and PPIs were the world’s largest-selling and most lucrative drugs, treating aging patients born while H. pylori, a waterborne pathogen, was prevalent. Today, large-scale hydro-engineering projects occur at a small fraction of their former frequency, and the gross sales of gastric pharmacotherapies, and the innovative creation of new ones, are comparatively tiny and few.
Is the contraction of PPI markets and the reduction in sewer treatment projects evidence of an innovation crisis, or of a reduction in unmet need? Why has investment in subway and high-rise construction fallen? In the 1920s, as the New York City subway system was completed and was the envy of the world, the city had between 8 and 9 million residents who paid a billion fares per year. Those numbers are still largely the same today. Before the completion of most high-rise housing, New York City reached its steady-state population. By the 1970s, during the decades of decline in US total factor productivity, national firms and their employees were abandoning New York City, raising vacancy rates. So why build and innovate more subways, buildings, and their associated technologies? What was the unmet need? The answer is, there was none.
The only objection raised by these facts, that even the poor in the West have excessive basic resources in calories, in utilization of individual transport, in spending on education and housing space, is that people are still poor and life for many is grim. But is this a problem of innovation, of productive growth? Would making energy free, as once imagined, or food free, as it nearly is in terms of minimum daily calories, make life less grim? The answer is no, with the sole exception of the extremely poor, defined by the World Bank as those with less than a dollar a day of income, a vanishingly small population in the US and one attributable not to jobless or homeless conditions but to mental illness and drug addiction. There is no evidence that more square feet, or more individual driving, or more spending on education will meaningfully reduce the true unmet needs of lower-income people. It may make car companies, energy companies, landlords, and teacher unions richer, but greater innovation in individual transportation, education, energy, and food production will not reduce unmet needs in these areas because they are already so low. No amount of additional spending above the already impossibly high per-student costs to simply teach a first grader to read will improve literacy rates. Even $100,000 per student per year would not improve the reading scores of the urban and rural poor. And if it did, such improvement would not be due to innovation, which we have defined as doing more with less. Rather, by reducing the scarcity of these resources, suburbanization has led to their inflated worthlessness. Cheap goods and services have been devalued to the point of laxity. This is reflected in obesity rates, falling test scores, and falling birthrates, which for any other living system of organisms would rise with expanding resources. That is, until their own waste chokes them. This is the cradle of our heroes, The Muppets.
End of Part 3
Ok, if necessity is actually the mother of innovation and lots of needs have been met in the last 100 years, then why did growth stop, the ASB become irrelevant, and suburban consumerism take hold and become the millennial Muppet cradle sometime in the nineteen-seventies? And what about Frank Sinatra? Stay tuned for Part 4.
Warning: Part 2 contains a philosophical discussion of innovation that is a bit dense. If you’re here for the comic jabs at “The Muppets”, you may want to skip to Part 3. My apologies.
(Editor’s note: Because the author was so expansive, I have divided Part 2 into two parts. So, what The Fat Man refers to as Part 3 will actually be called Part 4.)
The hypothesis I will posit and attempt to demonstrate in the next two parts of this humble correspondence has two main themes. First, that the America of the hundred to hundred and fifty-odd years ending in the nineteen seventies was in every way exceptional; second, that it was so because it had to be.
What gave birth to the ASB that catalyzed an array of naïve musical craft forms into a global cultural phenomenon? How could it be that slave and peasant musical traditions could be combined and transformed to such success? How did a string of still photographs projected on a screen go from peep show to a universal, dare we say, artistic medium? And how did both these forms descend into their own basements? Why even is the use of a phrase like “artistic medium” to be feared and derided?
What if the same dynamic could be identified as the driver behind the creation of General Electric and The Bomb that obliterated those two Japanese cities? What if accounting for that dynamic could answer Peter Thiel’s most interesting questions: “Why are our cities strangely old?” “Why did the space program abandon Mars?” “Why does it take longer to travel between cities in 2020 than it did in 1970?” Put more simply, how can the America that stormed Normandy and called a moonshot in 1961 “by the end of the decade” with Ruthian certainty end up frightened by Antifa?
To answer all these questions, we first need a definition of innovation that helps to describe some common process to all the unlikely triumphs we have mentioned, from Louis Armstrong to Robert Oppenheimer. We need a definition that comprises economic trends reflected in metrics like the GDP, and the commercial success of mechanical innovations like the production of replaceable parts in firearms; cultural phenomena like the art movements that come to be described as “universal”, or the emergence of global capitals like New York in the mid-century.
What is innovation?
Galileo, Newton, Einstein and Heisenberg. These names transcend words like discovery and invention. For human beings, the members of this class are, along with a few seemingly from other fields, names like Homer and Shakespeare, perhaps Mozart or Beethoven, the ones that define our world. We don’t have to worry about their sins or similarities because they are like their creations, both real and unreal. There is no E in E equals MC squared in the real world, any more than the number one. E and one are exclusively human. There is no ideal realm where they reside outside of our minds. They are beyond the hills, the animal or mineral, shared only in the humanly conceived eternal. They are wholly ours, and once invoked by anyone they join the patrimony that is accessible to all if we choose to claim it. We can choose, however, to lose treasures like F equals MA or “it is the east and Juliet is the sun” or Euler’s identity. We can forget or revise or misattribute or commit a hundred other crimes against history. We can break the chain of humanity that links all ages and places, to every remembered and forgotten name, with the new and the unborn. We can fail to imagine.
Lesser mortals do lesser things. They discover like Columbus or Curie; they invent like Edison and Bell. A lightbulb is not humanity, but it helped humanity read. Telephones were not a part of us, though they did at times seem attached. America is not Italy, but someone had to sign the map. We remember these names and forget, revise, misattribute them at much less peril, perhaps some would say at no peril at all, perhaps even to our benefit. But the status of the names of our discoverers and inventors matters today if not tomorrow. We need them today to tell our story, even our history, but they are not immutable giants like the others. Because we all know who gets to write history, the stories beneath these names can change from discoverers today to slavers tomorrow.
Far below the Olympian pantheon of Newton and the discoverers’ Rushmore of Edison, in a stratum of the day-to-day, lives innovation. It has no name but certainly is more fun. Discovery finds things and invention makes things, but innovation gets to do things. And nameless, it is free to beg and borrow, not caring who found it or made it so long as it can use it. Innovation is the doing with what was discovered, invented, invested, neglected, or just plain forgotten.
Innovation has no name, or at least it shouldn’t. The artifacts of innovation are not important, but their impact is. What is a subway or a skyscraper? Who would care, except that they move infinitely more people faster in a crowded city than any combination of horse and car, or fit infinitely more people to live and work on a half-acre than is possible in any other urban plan? But innovation does not only serve the visceral. The long line of innovations that culminated in the Gothic cathedral is nameless. But at some point, in the 11th or 12th century, those innovations lifted whole societies to spiritual consensus. Yet there is no name associated with the Gothic cathedral except Chartres, Cologne or Notre Dame. In fact, subways, skyscrapers, cathedrals, choirs or even particular iPhones change as we use them and disappear when we don’t. Innovation doesn’t have its fun alone; we get to join in.
In the sense that innovation is not discovery or invention, we can also say that it is not exclusively human. Because it is nameless, it is also, to the extent it is distinct, not aware. Innovators manage the details of their initiatives and even at times claim to plan their applications. But no one ever knows when they cross the boundary between an improvement or invention or discovery and true innovation. So, as anyone who has ever seen the cat finally achieve the canary knows, animals innovate as well. Nor does any one individual ever really innovate. Beyond the clichés about standing on the shoulders of giants, innovation relies primarily on feedback loops, whether from a market or a metabolism. And beyond animals, all biological systems possess in their ontogeny the mechanisms of not just change but proliferative innovation. From this perspective, no doubt, it is conceivable that, by their ability to determine natural existence, the laws of physics, in their constants and relations and limits, do as well. Or at least one could probably find a business-minded physicist to agree. So, it is also cliché to say innovation is collaborative or diverse or possessing of secret ingredients, let alone genius. Innovation emanates as all phenomena do, that is to say, through itself.
This view of innovation is useful in a number of ways. It avoids the sociology of science associated with the Olympian creations that began our discussion. Newton’s human creations like numbers and letters truly are human constructs, artifacts. Concentrated matter moving through space is no artifact. The novel phosphorylation of a bioactive molecule that confers a replication advantage is a fact, observable, unaware, unstoppable. Humans can only participate in innovation; they cannot originate it. We are lucky when we properly observe it.
If innovation is not human, then it must be free from the requirements of human logic. Innovation is not consistent or moral or balanced or meaningful beyond the very next step. Innovation is productive change, and with that single modifier alone it is unconstrained in ways no human system can be. It can comprise blitzkrieg and washing machines. It moves along paths that cross all boundaries and all borders. It can change its products, landscapes and even man-made literary forms. Innovation is free to impinge on domains that are aware and self-constrained without being so itself.
All we have said so far describes what innovation is not and qualities of its nature. But what is innovation? Economists define innovation as the translation of an idea or invention into a good or service that creates value, as reflected in the customer’s willingness to pay for it. So, innovation in this context is the occurrence of a new offering that generates sales. But innovation is also a larger concept, usually best measured by the economic idea of dynamism. Dynamism is defined as the creative destruction in an economy that reallocates resources across firms and industries according to their most productive use. Presumably this destruction can, at least in part, be bottom-up, unplanned, or subject only to market guidance.
In its broadest sense, as we have discussed it so far, we might simply define innovation as productive change: change that moves in a self-defined positive direction. A successful virus is essentially a protein shell with an innovation factory coded into its genetic material. Its sole function is to continually make slightly inexact copies of itself, so as to ensure that some of its related progeny can survive the immune systems that act as its feedback loop. To that virus, this is productive change, or innovation.
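The virus’s mutate-and-filter loop can be sketched as a toy simulation. Everything here is illustrative, not drawn from virology: a genome is a string of bits, the “immune system” is a remembered signature, and survival means drifting away from that signature through imperfect copying.

```python
import random

random.seed(0)

GENOME_LEN = 16        # toy genome: 16 bits
MUTATION_RATE = 0.05   # chance each bit is miscopied during replication
POP_SIZE = 20
REMEMBERED = [1] * GENOME_LEN  # signature the "immune system" recognizes

def copy_with_errors(genome):
    # Slightly inexact copy: each bit may flip during replication.
    return [b ^ (random.random() < MUTATION_RATE) for b in genome]

def exposure(genome):
    # Feedback signal: how closely this copy matches the remembered signature.
    return sum(a == b for a, b in zip(genome, REMEMBERED))

# Start with exact matches to the signature: maximally recognizable.
population = [[1] * GENOME_LEN for _ in range(POP_SIZE)]
start = sum(map(exposure, population)) / POP_SIZE

for generation in range(40):
    offspring = [copy_with_errors(g) for g in population for _ in range(3)]
    # The immune feedback loop culls the most recognizable copies;
    # the least recognizable survive to replicate again.
    offspring.sort(key=exposure)
    population = offspring[:POP_SIZE]

end = sum(map(exposure, population)) / POP_SIZE
print(f"average match to signature: {start:.1f} -> {end:.1f}")
```

No copy “intends” anything; random miscopying plus a culling feedback loop is enough to move the population in a direction that, to the virus, counts as productive change.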
So, when is change productive, or destruction creative? The laws of physics and biology seem to imply that these are oxymorons. Science holds that all change is random, and certainly all destruction must be, so how can it be productive and creative? Does not its anonymity and randomness exclude any notion of “positive”? The answer must be no, but only because reconciling these seeming contradictions leads directly to the question of intentionality and the origin of change. The origin of change is itself a question of first causes that, as we have said, is immanent yet unbounded by space and time. Even a physicist would agree that the universe is productive because of primal conditions whose own origins are inexplicable, partially observable, even describable, perhaps, but ultimately unaccountable. But where does the ineffability of productive change lead us in our search for its nature? It frees us. Clearly productive change exists, as do distinct stars that convert matter to energy and men who turn forests to farms, so we are free to inquire and observe without accounting for first causes. In our investigation, we also can be dynamic along with others in our niche and join in the reallocation. But as human, logical commentators, at least, we are obliged to make observations that suggest relationships, if not lessons.
So much for the ultimate source of change; what about proximate causes? What about their number and weight? This is not obvious, yet it is the main business of our discussion. And although economics would seem to be the obvious framework to account for the proximate causes of innovation, those most familiar with that exercise commonly offer only very subjective, sometimes poetical explanations of even large changes in innovative trends. The great economist of innovation, Edmund Phelps, cites the loss of the “spirit of adventure and discovery” as chief among the proximate causes of the halving of the 3% annual growth in US GDP that he attributes to American innovation, a rate stretching back two centuries before the 1970s. To understand the proximate causes of the end of American innovation in the 1970s, we must first understand its proximate causes going back at least those two centuries and likely much earlier.