What Remains to be Done?


To understand where technology is headed, one needs to understand where it is and why it got there. In the United States, the driving force has been unfettered competition operating in a free-market system. Entrepreneurs who have provided value to customers have collected monetary rewards in proportion to the value provided, while many others have crashed and burned in the attempt.

Back in 1776, Adam Smith referred to this give and take between buyers and sellers as the workings of the “invisible hand” of the market, always seeking to provide a better outcome for the participants. Little did he know the power that the computer would give to this invisible hand.

In the sixty years since the silicon chip was invented, scientists and engineers have relentlessly driven increases in processing capacity and speed. In addition, application software sitting on top of the technical platforms has blossomed to address specific customer requirements. Artificial Intelligence, while not bringing Star Trek’s Data to life, has begun to uncover more and more information on individual consumers’ buying preferences. Putting these trends together, we can conclude that the invisible hand of the market will use new technology as it becomes available to meet the personalized demands of consumers as those demands are systematically identified. Technological advances will drive personalization into every corner of society.

For example, personalization in the auto industry would imply that each person will have a car that matches his/her own physical size, capabilities, and desires. The “fitting” would be done remotely with the buyer “test driving” the customized product on a simulated basis. Financing would be optimized to the person’s personal financial situation. To travel with another person, one would simply connect one’s “pod” to another’s through a standard interface. The destination would be programmed through a simple human interface. The pod would be routed to the destination much like a data file is routed across the interconnected networks that comprise the Internet. Hundreds of pages of documentation provided today in Operations Manuals would be replaced by user-friendly, self-diagnostic systems.  Congestion would be better load-managed through a series of computerized grids. On the other hand, where a consumer does not use (or understand how to use) a feature, the feature would not be installed thereby saving some costs. 

We know that technology will invade each sector of the consumer marketplace and strive to satisfy consumer demand to the maximum extent possible – down to the individual person’s wants and desires. Large gains for consumers are achievable, but we must recognize that in some markets they will be stifled by market failures.

Unless these market failures are addressed, future generations will not receive the full benefits of technological advancement in the key markets of housing, healthcare, and energy. 

While the future has the potential to be quite bright given the expected technology advances, a new set of leaders is needed to address the key market failures before real progress can be assured. 

Now let’s take a quick look at where technology has been and where it is headed. 


In 1958 and 1959, Jack Kilby and Robert Noyce, two American electrical engineers, independently invented the integrated circuit – the silicon chip. Their creation revolutionized and miniaturized technology and paved the way for the development of ever-increasing computer processing capabilities and communications speeds.

By 1965, Gordon Moore had made the observation that became known as Moore’s Law: the number of transistors on a microchip doubles roughly every two years, while the cost per unit of computing is halved in the same timeframe. While no longer in effect, the advances in – and adoption of – enhanced computer technologies under Moore’s Law are impressive. After all, doubling every two years compounds to roughly a 5x gain in five years and a 32x gain in ten. The result has been that more complex tasks can be accomplished at a lower cost, thereby bringing demand and supply closer together for an increasing number of computerized applications.
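The compounding behind Moore's Law is easy to verify with a few lines of arithmetic. This is a simple sketch using the two-year doubling period; the function name is illustrative, not from any library:

```python
def moores_law_multiplier(years, doubling_period_years=2.0):
    """Growth factor implied by doubling every `doubling_period_years` years."""
    return 2 ** (years / doubling_period_years)

# Doubling every two years compounds to 32x in a decade
# and roughly 1000x in twenty years.
print(round(moores_law_multiplier(10)))  # 32
print(round(moores_law_multiplier(20)))  # 1024
```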

With computer capacity sitting unused on the desktop, the invisible hand of the market introduced an increasing number of new consumer applications, including a plethora of games that children of all walks of life love to play. Luckily, while the kids of all ages were playing Spacewar on their new computers, others were converting documents, files and communications to digital format. The days of using the telex machine and Morse code to communicate were numbered.

To accommodate the increasing amount of data available on the desktop, the communication carriers began laying fiber optics cables in the 1980s, thereby dramatically increasing the capacity and speed of moving data. 

In retrospect, we can now clearly see how various applications such as online shopping or social media used the Internet to take advantage of the technical advances in digital processing, packet switching and data transmission. In the other direction, services such as fax quickly became the exception rather than the rule, on their own paths to extinction.

The development of Artificial Intelligence (AI) software became a lightning rod for fears about technology, conjuring images of machines replacing people in tasks from cutting hair to product assembly. But for our purposes, AI’s greatest impact will come from its ability to decipher consumer preferences from repeated observations. We know that most people are more likely to buy when presented with a personalized menu of goods or services. Hence, we can assume that the invisible hand of the market will use AI to dig down to individual consumer preferences. Companies that gather this information and satisfy these preferences will have the best chance of success.

While Moore’s Law no longer rules, the underlying advancement in technical capabilities remains strong. One proposed successor is Neven’s Law, named after Hartmut Neven, the director of Google’s Quantum Artificial Intelligence Lab. Neven has stated that the power of Google’s best quantum processors is growing not just at an exponential rate, as under Moore’s Law, but at a doubly exponential rate.
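The gap between the two growth regimes is easy to see numerically. The sketch below compares a generic exponential sequence with a doubly exponential one; it illustrates the general idea, not any actual processor benchmark:

```python
def exponential(n):
    """Moore-style growth after n improvement steps: 2^n."""
    return 2 ** n

def doubly_exponential(n):
    """Neven-style growth after n improvement steps: 2^(2^n)."""
    return 2 ** (2 ** n)

for n in range(1, 6):
    print(n, exponential(n), doubly_exponential(n))
# After five steps, 2^n is only 32, while 2^(2^n) is already
# 2^32, more than four billion.
```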

On a separate track, the FCC, by the late 1990s, had freed up spectrum for digital mobile communications, aptly named Personal Communications Service (or PCS). The new digital technology came to be called 2G, for second generation. At that time, 2G networks moved data at tens of kilobits per second – far too slow for video. Today the carriers are building fifth-generation (5G) networks that can send and receive files at average data rates of 100+ megabits per second (Mbps). New applications are being developed to take advantage of these underlying capabilities.

Taken as a whole, the amount of computing capacity, the speed and reach of communications networks and the ability for software applications to understand and record individual consumer preferences are improving rapidly and will continue to improve significantly over time. These advances have produced gains in every consumer market. Going forward, the pace of change and corresponding impact will differ in each market, as discussed below.

As technology progresses, the value to cost ratio for many new features and services will improve and many will become economically viable. In some cases, the incremental cost of new services and/or features will approach zero. The question then becomes who will step up to satisfy the newly identified consumer demands using these latest technologies.


Consumer marketplaces change rapidly, often because of technological change. To survive, companies must adjust. And they do – when they address consumers’ demands. Otherwise, they are toast.

This is not just textbook theory. While working at the FCC, I was privileged to head the analysis of the restrictions on cable TV companies’ importing of distant signals to compete against the local affiliated stations of ABC, CBS and NBC – a powerful political and economic oligopoly. The broadcasters argued that cable TV would steal their audiences and render them uneconomic, particularly in the provision of local news, to the detriment of the public. On the other side of the ledger were Ted Turner and other entrepreneurs who wanted to provide nationwide programming. As it turned out, the Commissioners voted 4-3 to deregulate cable. The staff then authorized satellite carriage of distant video signals to improve the economics of thin, long-haul transport to cable headends around the country. In turn, Turner started Cable News Network (CNN), and a slew of other providers entered the video marketplace with niche programming. On the other side, no TV broadcaster went out of business, as they emphasized their unique coverage of the local community and its events.

In contrast to this outcome, after the FCC I went to work at MCI, which was headed by Bill McGowan, the fiercest competitor I have ever met. McGowan’s entry into the communications industry won MCI over 10 million paying customers and earned him hundreds of millions of dollars in personal wealth. He attacked AT&T’s weak spot with a low-cost, voice-based, long-distance service. However, he recognized that the future resided with data and wireless communications – which did not recognize any artificial boundaries between local and long-distance services. After a few too many cigarettes and shots of bourbon, he died prematurely of heart failure. MCI’s new leaders were not keen on the next generation of wireless technology – namely, Personal Communications Service.

It is now easy to recognize that wireless communications service has become the backbone of modern-day consumer communications. But even back in 1994, MCI’s top brass were told that PCS would be a complete nationwide service enabling one small, portable device using a personal number to provide call management features, email, messaging, voice mail, information services and video communications at a cost at which 94% of the public would subscribe. I know this because I told them, shortly before I resigned as the head of MCI’s PCS business unit (after seeing the writing of self-defeat on the proverbial MCI wall). As it turns out, MCI chose not to bid on a national combinatorial license that would encompass all local markets and missed getting any PCS spectrum. It was subsequently acquired by a much smaller company and, a few years later, filed for bankruptcy. The company with the most dynamic position in the telecommunications industry in the 1980s, with an army of over 10,000 competitive zealots, became an unknown entity less than ten years later. Its leaders had lost contact with the direction that its customers wanted to go.

To add to its woes, MCI had the misfortune of seeing the CEO of the firm that acquired it indicted for accounting fraud. Obviously, unfettered competition does not mean that the “rule of law” does not apply.

The Impact of Technological Change in the Communications Market

Consumers have relied upon four major technology platforms in the communications market: copper wires for telephone service beginning in the 1870s, over-the-air broadcast signals for radio and TV beginning in 1920, coaxial cable for cable TV beginning in 1948, and operator-connected car telephone service for mobile telephone service beginning in 1946. These technologies developed independently, each serving different customer needs.

Over time, advances in wireless and data capabilities changed the industry balance. Once the telecom carriers deployed fiber optic cables and interconnected their networks, data communications grew rapidly, and the Internet became a staple of everyday life. Today, we transmit roughly 33 zettabytes of data per year, and this number is growing by about 25% per year.
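A back-of-the-envelope projection shows how quickly those traffic figures compound. The sketch below uses only the numbers cited above; the five-year horizon is an arbitrary illustration:

```python
def projected_traffic_zb(start_zb, annual_growth, years):
    """Compound-growth projection: start * (1 + growth)^years."""
    return start_zb * (1 + annual_growth) ** years

# 33 zettabytes per year, growing at 25% annually,
# roughly triples over five years.
print(round(projected_traffic_zb(33, 0.25, 5)))  # 101
```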

As the conversion to digital technology became more prevalent in the 1970s, European leaders agreed by 1982 to establish the Groupe Spécial Mobile (GSM) standards organization to develop a second-generation digital standard for mobile telephone service. Its participants realized that digital wireless technology could theoretically accommodate all forms of communications (namely, messaging, voice, data, and video) that previously had been delivered on different technology platforms.

The principles underlying the GSM standard were straightforward. The service offered by carriers would accommodate any device meeting the interface spec and would transport any form of communications provided in a standard digital format. By 1988, the detailed specifications were released and, by 1991, the first GSM customer went live. Ten years later, one billion customers had signed up for service.

While the GSM standard has been absorbed into newer versions of wireless service, the personalization of digital cellular service fueled its growth to over 5 billion customers worldwide and is a model for other industries to follow. Consumers will continue to “cut the cord” if competing technologies do not deliver better value. Certainly, that has already occurred with the younger generations.

The biggest technology fight left in the consumer communications market is the provision of high-quality video services. The long-standing method of using satellites to send TV channels to cable headends is no longer economically justified. Also, as noted above, wireless communication providers have not yet found their sea legs as high-quality video service providers. The Internet is the most economic and ubiquitous transfer mechanism, but it offers no control over routing. Therefore, Internet Service Providers (ISPs) and the subset of “streaming video” providers cannot offer a guaranteed level of performance for video delivery.

As of today, it appears that the best path to deliver broadcast-quality video is an overlay network on top of the Internet.  Although the end-game approach is not clear, the consumer can’t lose in this war. He or she will get much better video service at a lower cost. People will also be able to make, watch and/or participate in “TV” shows from home.

For the entire consumer communications market, the expected future technology changes will not be as great as for the current generation of users. Young people have already seen the future of communications technology and that is Personal Communications Service.  The changes going forward in mobile communications will be evolutionary, not revolutionary, certainly when compared to other markets and other technologies.

However, as 5G and future generations of digital mobile communications are deployed, these technology upgrades will have tremendous effects in other industries in providing more value to their customers, as discussed below.

The Impact of Technology on the Housing Market

Everybody loves to talk about the real estate market, in part because of its oversized booms and busts. However, in terms of actual changes, it has been stodgy compared to other consumer markets. 

The residential housing market consists of new house construction, home sales and rentals. It is supported by an array of activities such as procuring raw materials, real estate development, brokerage services and mortgage lending.

Looking at the average consumer’s monthly budget, housing is the largest item, representing about one third of expenditures. As importantly, housing represents approximately one third of the net worth of the average American and the vast majority of consumers’ non-financial assets.

Two broad market trends have harmed the younger generation in this market. First, rents have gone up roughly 50% over the past five years, while incomes have increased only 5%. Second, the transactional costs of buying and selling homes have risen to almost 10%, which has contributed to relocations dropping from 20% to about 10% per year.

Entrepreneurs have focused on using technology to address the industry’s relatively high transaction costs. First, the new entrants designed websites to provide much more information on the current inventory than previously available. Second, entrepreneurs used analytics software to provide buyers and sellers the ability to estimate the value of a home before it is put up for sale. Third, entrepreneurs gave real estate agents software to enable their operations to run more efficiently. As these new capabilities take root, transaction costs and the time to move to a new location will decrease significantly.  

Smart technology is also moving into the home. In the residential arena, that means connecting a device or devices in the home to an application on the Internet through a Wi-Fi connection. That doesn’t sound like much, but more than 33% of U.S. households already have at least one such device in use, with projections to more than double in the next five years. Moreover, the number of devices per household is expected to double for new applications once 5G digital wireless service rolls out. At a minimum, smart home technology will save electricity and reduce power and water bills. Even just connecting an underground irrigation system to the Internet may result in improved soil conditions and a nicer looking property simply from smarter watering.

Moreover, the Covid-19 virus, in addition to disrupting society, indirectly showed that many more job functions could be executed remotely given the expansion of broadband capacity that has penetrated the home. Of the 150+ million workers in the U.S., about 4.7 million worked remotely before Covid-19 hit: an increase of almost 1 million people over the previous five years.

The Covid-19 experience had the unintended effect of highlighting the need for Information Technology (IT) improvements in the business environment. It became clear that email was insufficient to track issues and that a ticketing system was required. Similarly, businesses were forced to think of automating workflows for teams, rather than individual assignments. Even one-on-one interactions between employees and HR, legal, etc. were streamlined for crisper and more complete resolution (e.g., by tracking issues with a closed loop system).

The results from the Covid-19 experience both from the employee and employer perspective should result in the number of at-home workers doubling in the next five years. This change could have the indirect effect of driving people to live further from the city and thereby somewhat flattening the price curve between urban, suburban, and rural properties and improving the quality of life.

Taken together, technological changes will come to the consumer housing market in a series of small but revolutionary steps. The result for the next few generations will be a much better functioning housing market.

Today’s youth will experience greater mobility, more geographic choices, lower transaction costs and more efficient homes than exist today. The remaining structural impairments to home ownership from excessive student debt and stagnant incomes (relative to rent increases) will have to be addressed by other means.

Impact of Technology Changes in the Transportation Industry

After communications, the market closest to the consumer’s heart is transportation. This industry also receives a lot of attention from the technologists.

In the U.S., 95 percent of households own a car, and most Americans (85 percent) get to work by car. The other modes of transport are train and bus and, to a lesser extent, boat and plane.

In addition to the costs, consumers are interested in convenience, safety, pollution, travel conditions (roads, bridges) and energy consumption. Convenience is the major factor in choosing between the different modes of transport.

From a cost perspective, transportation represents the average household’s second largest monthly budget item at roughly 15%. The amortized cost of vehicle purchases represents about 40% of the monthly transportation budget; 22% on average goes to public transport and about 10% on insurance.

Buses are the biggest source of public transportation. Every bus rider has experienced inconvenient payment options when boarding, waiting with no visibility as to when the next bus will arrive, and getting stuck in congestion caused by lane closures from construction.

Fortunately, there are technology solutions for each of these problems. Unfortunately, most inner-city bus routes are publicly owned and operated. No one since Adam Smith has found any invisible forces to satisfy the consumer’s needs in a not-for-profit or government-run market.

In the auto sector of the consumer transportation market, manufacturers finally added computer technology to cars after decades of procrastination. The current generation of new features and capabilities includes GPS mapping technology, sway control, steering assist, emergency braking, back-up cameras, etc. For example, V2V technology allows cars to continually communicate with the vehicles around them so that each is aware of the others’ speed, heading and direction. Connected vehicles also help in recognizing and alerting drivers to dangerous situations. Over time, competition will ensure that these capabilities are included in the starter cars purchased by young people.

The next wave of technology advancements for autos will focus on personalization. Manufacturers are beginning to automate the process of setting up a vehicle, including entertainment options and application preferences, based upon the individual buyer. The devices in the cars are connected to the Internet around the clock and can take voice commands. Therefore, features that address the most important customer requirement – convenience – will become available, including setting up appointments and receiving notifications of the car’s maintenance status, any malfunctions, recalls, tire pressure issues, safety hazards, etc.

Beyond these upgrades, future waves of change for automobiles are not far behind. For example, Tesla is focused on making all-electric vehicles more efficient while maintaining high levels of customer satisfaction.

Many other entrepreneurial companies are focused on developing self-driving vehicles.

A Japanese firm and a Chinese start-up are trial-testing a pod-like vehicle that flies.

I could go on, but the point will still be that major parts of the consumer transportation industry will experience mind-boggling and positive changes. These changes will not only satisfy consumers’ preferences for transport – jointly and individually – but will help with external issues such as pollution as well.

However, this industry will not move fast enough to keep up with technological change and enable customers to reap its full benefits. In particular, large portions of the industry fall under the auspices of government control, including public transport, roads and bridges, and enforcement of safety rules and regulations.

With a political system that is in disarray, it is difficult to imagine a bi-partisan effort to improve public transportation and fund a rebuild of the nation’s road infrastructure. The next President will inherit a deficit on the path to $30 trillion and will have many other priorities for spending the taxpayer’s scarce dollar. Fiscal management by the federal government has been terrible for the past twenty years.

The Impact on Energy

Over the years, the energy industry has provided consumers power from steam, coal, oil, gas, wind, the atom, and the sun. Power consumption in the U.S. rises by about 3% per year, though this growth has recently been counterbalanced by improving efficiencies.

The industry is moving forward with technology focused on streamlining production, distribution, and usage. Advancements such as multi-well pad drilling, multiple fracture stages and improved well and pipe design have already boosted drilling efficiencies. Producers also are using fewer rigs to extract more oil and gas in less time, which keeps costs down. Further advancements promise to keep driving efficiencies in shale production. For example, scientists and engineers are studying how different types of rock fracture produce hydrocarbons and learning how to optimize drilling in shale formations through more precise well siting.

Despite these gains, the energy industry poses the greatest threats to future generations. First, the world population has grown from 1 billion in 1800 to almost 8 billion today, consuming significantly more energy along the way. It is not clear how long the planet’s energy sources will last, given wildly different assumptions about future conditions. Second, many uses of energy, particularly fossil fuels, produce the external effect of pollution. Competitive forces do not address externalities imposed upon consumers in other markets, and governments have been slow to address the issue. The result has been climate change, which is obvious to young people. Third, technological discoveries have enabled nuclear energy and nuclear weapons to be put into use – with potentially devastating consequences, particularly in a volatile political world.

While technology advancements take shape in this market, the nation needs to address the costs imposed on the next generations from the underlying threats.

The Impact on People’s Healthcare

The impact of technology on, as well as the overall status of, health care for young people is one of the most difficult areas to forecast. The potential, and even the path forward, for improvement is clear, but institutional hurdles – including the federal government – have resulted in unacceptable quality outcomes and significantly increasing costs.

Relative to the size of the economy, healthcare costs have increased over the past few decades, from 5 percent of gross domestic product (GDP) in 1960 to 18 percent in 2018. This translates into an expense of roughly $11,000 per person in the U.S. Despite these excessive expenditures, we do not stack up well against other countries with respect to longevity or quality of life.

The healthcare industry is one where the invisible hand of the market cannot lead to an efficient outcome because there is an information failure that precludes informed decisions, particularly for chronic diseases. The federal government has not addressed this issue. Rather, it has focused on providing the uninsured with insurance such as under the Affordable Care Act.  This approach helps those receiving coverage as well as the healthcare providers who get the government subsidies but doesn’t resolve the underlying issue of excessive costs.

Where the ACA addresses costs, it imposes rules such as limiting an insurance carrier’s administrative costs to 15%. However, the impact on level of costs from this rule is unclear.

One would think that the federal government would focus on addressing the excessive costs of treating chronic diseases such as cancer, heart disease and Alzheimer’s; after all, these diseases consume 60-70% of all healthcare costs. For example, developing cures would be extremely beneficial in improving health and lowering costs.

However, no one oversees or coordinates an industry-wide effort to lower costs for chronic diseases or find cures. By contrast, the Food and Drug Administration (FDA) is responsible only for ensuring the safety of any drug reaching the market.

Given its charter, the FDA has introduced more and more regulations and requirements over time, with the result that the average cost of bringing a new drug to market now exceeds $1 billion. It has interfered with pre-clinical research (for example, by addressing embryonic stem cell research on mice) and has even taken jurisdiction over personal genetic tests (an information service overseen by the Federal Trade Commission). It takes twelve years on average to get the FDA’s approval, and that doesn’t include pre-clinical involvement in the research.

The following three real-world examples demonstrate the inflexibility and shortsightedness of the FDA’s approach.

In 2012, I petitioned the FDA to treat BRCA1-related breast cancer patients (which results from a genetic mutation) differently than other patients getting chemotherapy for triple negative breast cancer. The FDA summarized my Petition as follows:

“You make the following arguments:

· There is little to no factual evidence to support the purported efficacy of ACT drugs in the triple-negative BRCA1 subgroup (Petition at 2).

· There is reason to believe that ACT treatment is disproportionately harmful to this subgroup (Petition at 2).

· There are alternatives readily available for this subgroup with better results in clinical studies.”

The FDA denied the Petition saying, among other things:

· “the prognostic significance of having a BRCA mutation is not clear”.

· “Although patients that have a BRCA mutation have a significantly higher risk of developing breast cancer compared to the general public, the outcome for a patient with a BRCA mutation following standard (i.e., ACT) chemotherapy compared to a patient without the BRCA mutation is not clear.”

· “It is not completely understood how similar BRCA1-related breast cancers are to non-BRCA1-related breast cancers.”

· “We do not agree with your claim that the results of any clinical trial of TNBC cannot be applied with statistical validity to the BRCA1 subcategory of TNBC unless that subcategory is specifically separated out and monitored”.

· “While patients with BRCA1 mutation status who received adjuvant chemotherapy may have had a greater incidence of death compared to those who did not receive chemotherapy (Petition at 4), the increased incidence of death cannot simply be attributed to the adjuvant chemotherapy itself”.

· “There are data from early trials suggesting that platinums may be beneficial in TNBC [sic, not BRCA1], but this benefit has not been confirmed in a randomized trial at this time”.

· “based on information available to us at this time” granting the Petition “is not appropriate”.

To put this inconsistent and nonsensical gobbledygook into perspective, 15,000 women die from BRCA1-related breast cancer every year. The FDA doesn’t acknowledge the pain and suffering of these women.

In contrast to the FDA’s approach, in 2012, Jon and his wife Mindy Gray funded the Basser Center for BRCA in honor of Mindy’s sister who died from a BRCA-related cancer. Unlike the FDA, the focus of the Center was to find a cure.

Led by Dr. Susan Domchek, the Basser Center has made many breakthrough findings on BRCA-related cancers, including why PARP-inhibiting compounds vary so much in their clinical effectiveness. Basser also showed how PARP inhibitor drugs can be ‘tuned’ for better killing of tumor cells.

This organizational approach with specific resources assigned with a specific objective to a specific target is the correct way to address catastrophic illnesses. The federal government seems unlikely to take such an approach no matter what analysis or facts are presented. Moreover, the FDA will be putting up many roadblocks before Basser’s findings will be available to very sick women who need help to live. The FDA simply does not factor the cost of delay into its deliberations. Its charter is to ensure that unsafe drugs do not reach the market. 

Interestingly, the cost burdens and delays for new treatments effectuated by the FDA benefit the pharmaceutical companies by preventing competition with their existing product lines. These pharmaceutical companies spend about $100 million per year on lobbying. The FDA’s procedures enable politicians to cry foul whenever a new drug produces any harm. By the same token, there is no pressure on the FDA to improve the health and well-being of those with chronic illnesses by establishing goals for improvement with associated timelines. Such goals and timelines should be established and reviewed, and performance rewarded as appropriate.

The second example addresses the Petition that I filed with the FDA in 2013 regarding metastatic cancer, which is responsible for over 90% of cancer deaths but receives only 8% of cancer research funding.

The Petition requested the FDA to: 1) create a new Center for the Treatment of Metastatic Cancer; 2) collapse the number of phases in metastatic cancer trials from three to two; and 3) champion increased funding for metastatic cancer research.

The FDA denied this Petition saying:

1) Although many clinical development trials follow the three-phase framework, nothing in the FDA’s regulations requires three distinct phases.

2) The FDA already has a program to address serious or life-threatening conditions. 

3) The FDA has various other programs to facilitate and expedite development of new drugs.

4) The FDA has established the Office of Hematology and Oncology which has helped to advance the development of new cancer therapies, the majority of which are directed at metastatic malignancies. 

5) It is not appropriate for the FDA to advocate for additional funding.

Not surprisingly, the percentage of cancer research dollars targeted towards metastatic cancer has decreased since this Petition was denied. Meanwhile, over 3 million people have died from metastatic cancer over the same period.

The third Petition that I filed, in 2016, asked the FDA to issue a Notice of Proposed Rulemaking to receive comments on how best to implement the new 21st Century Cures Act, which contained $4.8 billion in funding for precision medicines. Historically, researchers have had difficulty getting funding for precision drugs after the pre-clinical breakthrough because the target market tends to be small. Scientists refer to this phenomenon as the “Valley of Death” for their scientific discoveries.

This Petition was also rejected by the FDA. It said that the “Agency is committed to encouraging innovation” and that it approved 240 new drugs from 2013 to 2018, 76 of which could be considered precision drugs. It explained that the “FDA has also published many guidance documents on its website that may provide helpful information to sponsors of precision medicines as they work to obtain FDA approval”.

Inside the FDA, the view apparently is that every practice is optimized to perfection.

Outside of the FDA, researchers look at mortality rates as one indicator of the quality of healthcare. In this regard, over the past forty years, the mortality rate in the U.S. has dropped by 29%. However, the decline for other advanced countries has been 44%.

Similarly, the disease burden, which accounts for both premature death and years living with disability, is often measured using disability adjusted life years (DALYs). Although DALYs have declined in the U.S. since 2000, the U.S. continues to have higher age-adjusted rates than those of peer countries. In 2017, the DALYs rate was 31% higher in the U.S. than for comparable countries, on average.

In some areas such as the rates of medical, medication, and lab errors, the U.S. has the lowest performance (highest percentage of errors) of advanced nations.

The statistics above do not include the results for the Covid-19 pandemic, where the U.S. will not fare well compared to other countries.

It is true that technology has brought about many welcome changes to the healthcare industry. For example, electronic records, remote consultations with specialists and the availability of intuitive mobile applications have led to improved patient care and a superior healthcare experience overall. Additionally, the availability of newer treatment technologies leading to better outcomes has enhanced the quality of life of patients as well. GPS enables the Life Alert service to save lives every day.

Telemedicine has made it possible for patients to use telemedical devices to receive home care. The store-and-forward feature helps transmit biosignals, medical images, and other data to a specialist to facilitate asynchronous consultations (which don’t require both parties to remain present or online at the same time). This can significantly reduce waiting time for patients, speeding up treatment delivery processes.

But technology has so much more potential in the healthcare industry.

Taken as a whole, the inefficiencies of the healthcare industry will remain a large challenge for future generations, despite significant scientific progress. 

The Impact on Travel

Wireless technologies and smart mobile phones are making it easy to find different ways to travel. Consumers can easily determine transportation options to any location; check schedules; compare items like cost, speed, convenience and even carbon emissions; and then choose the best method for each trip. Moreover, transport carriers and destination locations are providing broadband and wireless capabilities for their customers.

Health-related impacts such as Covid-19 will be only temporary impairments, the extent and duration of which depend on the efficacy of the response. Scientists have a near perfect record in developing vaccines for viruses. But timely and successful implementation takes a coordinated effort. The federal government needs to play an effective leadership role in such circumstances, free of political distractions.

Assuming a modicum of competency on such issues, the travel industry will see much better times. The overall change in the cost of travel should not exceed the general rate of inflation for the foreseeable future. Additionally, travel destination options, the time devoted to travel and the enjoyment from travel should all improve going forward for the next generations.

An Extra Word About Market Failures

Adam Smith’s invisible hand works wonders for consumers in workably competitive markets. Of the 4,000+ markets that the government tracks, about 95% are workably competitive.

Markets do not produce optimal outcomes in cases of market failure. Market failures occur in three of the key markets addressed above.

In healthcare, there is a failure of information flow and therefore a lack of knowledge for making decisions, particularly for chronic illnesses. What are each of the providers of care for a particular chronic illness trying to accomplish? Is the research community coordinated on which avenues to pursue to avoid unnecessary overlap? Will the FDA fast-track this priority research? Will the insurance companies cover any drugs coming from those efforts? A standardized approach to the care of chronic illnesses would resolve these issues and substantially reduce the industry cost structure. 

The second market plagued by market failure is energy, where fossil fuels produce pollution that ultimately leads to climate change. How can I be so sure? While in college I studied the effects of pollution and read the predictions. In retrospect, the predictions came shockingly close to what is happening now in both magnitude and frequency. Taxes on the pollution would have reduced consumption of the polluting energy sources and helped to fund any resulting clean-ups.
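The logic behind such a pollution tax can be sketched in standard Pigouvian terms (a textbook formulation added here for illustration; it is not part of the original analysis):

```latex
\[
\underbrace{MSC(q)}_{\text{marginal social cost}}
  \;=\; \underbrace{MPC(q)}_{\text{marginal private cost}}
  \;+\; \underbrace{MEC(q)}_{\text{marginal external cost}},
\qquad
t^{*} \;=\; MEC(q^{*}).
\]
```

With a tax equal to the marginal external cost at the socially optimal quantity, producers face $MPC(q) + t^{*}$, the market clears at $q^{*}$ rather than the higher unregulated quantity, and the tax revenue is available to fund clean-up efforts.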

The third market affected by a market failure is the housing industry, given the indirect effect of reduced demand resulting from excessive student debt. Higher education has a series of market failures that are not internalized. First, having more people with advanced degrees has a positive external effect on the community as a whole, which is not reflected in the colleges’ tuition rates. Second, information is incomplete in terms of being able to determine the return on investment from different degrees. Third, it is near impossible for a student to assess and compare the level of quality and cost among various colleges and universities.

When market failure exists, it is the role of government to step in and adjust the supply and/or the demand parameters of the market to approximate a balanced equilibrium. 

Ideas for the Next Generation’s Leaders

Each of the challenges posed by the market failures described above can be overcome with effective leadership. Leadership requires a person with good judgment, a well-rounded education and high intelligence who can marshal resources towards achieving a specific objective for a specific target. The best place for such a person to lead an effort to overcome a national challenge is in the Executive Branch of the Federal Government with the verbal support of the sitting Administration. 

In dealing with student debt, can the Administration find a leader who is creative? For example, what services can students provide to get debt relief? State, local and federal government agencies are understaffed with Knowledge Workers. Can a program be put together that lets students reduce their debt by working for a few months (or part-time) for a government agency on Knowledge Management projects in the summer or after graduating from college? Can other programs be instituted in areas where the government is under-resourced such as with law enforcement or even with work on transportation systems?

Let’s not let future generations be burdened by these same market failures 20-30 years from now by failing to put the effort into fixing them today. Entrepreneurs get compensated for solving problems. What can be done to incentivize government officials to implement effective solutions? If the expected improvements are not forthcoming under a new incentive structure, can the execution of strategic imperatives be outsourced to for-profit third parties?

Summary and Conclusion

The invisible hand of competition working in free markets has brought great wealth and prestige to the United States and its citizens. The largest contributor to its successes has been technological advancement, with the invention of the silicon chip near the front of the pack.

While the pace of technical change may slow over the next twenty years compared to the last twenty, much of this slowdown will be caused by the market inefficiencies resulting from market failure. We have seen that any efficiency gains from technology can be quickly offset, for example, from a poorly executed response to a pandemic. The facts cannot be ignored for long without repercussions. 

The current generation of leaders from both political parties has failed to adequately address the market failures in the healthcare industry, the pricing of institutions of higher learning, and the components of the energy industry that cause pollution.

The solution requires new leaders to step forward and get buy-in for new approaches. Difficult? Yes. Impossible? No. Who will step up and address the biggest problems that the next generations will face? 

The pay-out to society will be great, and the rewards to the leaders who make it happen should reflect the successes.

Steven A. Zecola

September 29, 2020

# #  #