The speed limit for emerging technology
I talk to many people about AI and have noticed a trend. People from all walks of life, ages, and areas of expertise share a level of anxiety about the technology. Claims that AGI has already arrived, NVIDIA's stock price, and Boston Dynamics robots simply doing their jobs aren't reducing anyone's discomfort. So when the Nobel Prize committee announced this year's winners, I was surprised at the response, or rather the nature of the press coverage.
The most notable aspect of the 2024 Nobel Prizes was the recognition of AI-related work in Physics and Chemistry. Applying machine learning to accelerate outcomes in science reflects the profound impact of technology and recognizes its future potential.
AI has been accelerating scientific research for years, and recognition of those using it is overdue. Seeing the technology achieve a positive outcome for humanity, rather than fueling yet another wave of deepfake content, gives us hope.
What struck me was that Geoffrey Hinton and John J. Hopfield were awarded the Nobel Prize in Physics “for foundational discoveries and inventions that enable machine learning with artificial neural networks.” The pair has had an outsized impact on the evolution of technology. We can only hope their impact will continue, albeit in a different form.
Dr. Geoffrey Hinton retired from Google in the spring of 2023. He did so to speak freely about the dangers of AI and to express regret about his contributions to the field. He has since called for the regulation of AI because of its potential for exploitation by bad actors. Hinton was likely contemplating retirement two years ago, if not earlier, and AI has come such a long way in those two years that it's difficult to project its velocity two years from now.
Many in the technology industry saw the irony surrounding the Nobel Prize. The message of hope and excitement about the future is firmly grounded in the reality of our current state. It is easy to read too much into either a positive or a negative interpretation, and hard to know which position to take.
The truth is that AI could be both our salvation and damnation.
Whether you are bullish or bearish on regulation or torn about what to do about AI, we are living in an exciting moment. Given this context, I am writing a multi-part series on Regulation, Compliance, and AI.
The first post will address human nature, regulation, and the cost of enforcement. The second entry will focus on examples of regulation in technology, citing patterns and anti-patterns. The series will conclude with potential scenarios and Gyroscope's point of view on AI, Regulation, and Business Value.
An Imperfect Metaphor
I am no stranger to regulation. I do not work for a regulatory agency, publish public policy, or prosecute (or defend) those who don't comply with laws. However, I led design efforts for Risk and Compliance software for several years of my career, so I'm familiar with the space.
I think of regulatory compliance through the lens of human behavior. One example I apply is driving. The speed limit is visible to all drivers and pedestrians, so we know the rules. In theory, all drivers know how fast they can and should drive and when that speed changes. Do we follow the speed limit? Sometimes. At least when the police are around.
We follow the rules to the extent that they are enforced, motivated by the cost of breaking them.
We don't follow the rules because we are rewarded for doing so; in the case of driving, the only reward is the privilege of driving itself. We pass a driving test to earn the right to share the road with other people and different types of vehicles. Driving tests are required for safety, and people behind the wheel follow the same rules closely enough to maintain safety at scale. We obey these norms, yet accidents still happen. We all know someone who has been in a car accident, or have been in one ourselves. Why?
Additional factors include:
Does everyone have a valid driver's license? Nope.
Is everyone sober? Also no.
Does breaking each driving law carry the same cost? Absolutely not.
Is driving regulated? Absolutely.
Is this a perfect system? No, because enforcement isn't evenly applied.
From drivers to developers
Technology regulation provides all operators with a standard set of guidelines to follow. Guidelines maintain an equal playing field and prevent unfair competitive advantages. Ideally, companies that utilize the same technology follow the same rules.
But reality is more like driving. Some people operate with impunity, breaking whatever rules they want, whenever they want, until they are caught. Others follow only the rules that carry the most significant penalties and take their chances with the smaller ones. Still others default to following the rules and never stray from them. Variance in abiding by rules is human nature, regardless of context.
People follow laws to uphold social contracts to the extent they can afford the reputational and/or economic cost of breaking them.
In the US, police can pull you over given probable cause. A tail light may be out, or your registration may have lapsed. These are minor infractions that may result in only a warning. A minor infraction violates the law, but it isn't urgent enough to matter to most people. It can, however, be an indicator of larger issues, so law enforcement looks for problems both big and small.
Now, apply the behavior of driving and following rules to how companies operate. Most companies comply with laws and regulations. Organizations that don't face a combination of financial penalties and heightened scrutiny. Repeated illegal behavior can end with a company shutting down and/or individuals going to jail.
On Enforcement
It is challenging for a governing body that writes laws to understand the nature of complex technology. The more technical the applied science, the harder it is for non-technical people to grasp. Most people in the legal profession didn't choose law because they were passionate about technology.
Technology outpaces regulation for many reasons, not the least of which are its rate of change and time to value. Technology moves fast. Regulation and the legal process?
Not so much. By the time committees are formed and a formal process is underway, the technology may already have changed. Is technology worth regulating if the law can't keep up with it? Can we afford not to regulate emerging technologies that pose a viable social risk?
My next posts will discuss prior approaches to regulating and deregulating industries and emerging technology, presenting patterns, anti-patterns, and some familiar scenarios. From the Home Run Race of 1998 to the Silk Road, British Navigation Acts, and the Prohibition Era, the posts will provide a foundation for a POV on how we might successfully regulate AI.