Until relatively recently, consumers were often nagged to look for and download software updates, something many of us didn't do promptly, or often at all. As a result, many people ran out-of-date, insecure software, leaving them needlessly vulnerable to cyber-attacks and computer viruses.
In an effort to get prompt security updates to as many consumers and businesses as possible, the software industry has largely shifted to a model of automatic updates. As a result, our phones, computers and Internet of Things devices (such as thermostats and smart TVs) now regularly contact their makers to check for updates, which are then downloaded and installed automatically.
The transition to automatic updates has significantly improved the state of cyber-security. However, the existence of a mechanism to quietly deliver software onto phones and computers without the knowledge or consent of a user could be misused by criminals, hackers and nation states.
It is for that reason that tech companies have built in an additional security feature, known as "code signing," through which companies can certify that the software updates they've created are authentic. Without a digital signature proving the authenticity of the software update, it cannot be installed. This code-signing mechanism ensures that only Microsoft can deliver updates for Word, only Apple can distribute updates for iOS, and only Google can deliver updates for its Chrome browser.
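Code signing rests on public-key cryptography: the vendor signs a hash of the update with a private key only it holds, and every device ships with the matching public key to check the signature before installing anything. The sketch below illustrates the idea with textbook RSA and deliberately tiny primes; it is a toy for intuition only, since real systems use vetted cryptographic libraries and far larger keys.

```python
import hashlib

# Toy RSA keypair with tiny textbook primes -- illustration only.
# Real code signing uses 2048+ bit keys from vetted libraries.
p, q = 61, 53
n = p * q                           # public modulus
e = 17                              # public exponent (ships on every device)
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (kept by the vendor)

def sign(update: bytes) -> int:
    """Vendor side: hash the update, then transform the hash with the private key."""
    h = int.from_bytes(hashlib.sha256(update).digest(), "big") % n
    return pow(h, d, n)

def verify(update: bytes, signature: int) -> bool:
    """Device side: recompute the hash and check it against the signature."""
    h = int.from_bytes(hashlib.sha256(update).digest(), "big") % n
    return pow(signature, e, n) == h

update = b"os-update-v2.bin contents"   # hypothetical update payload
sig = sign(update)
print(verify(update, sig))              # genuine signature: True
print(verify(update, (sig + 1) % n))    # forged signature: False
```

Only the holder of the private exponent `d` can produce a signature that the public check accepts, which is why a device can refuse any update, however delivered, that its maker did not sign.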
Earlier this month, the American public learned that the Department of Justice had sought and obtained a court order forcing Apple to help it hack into the iPhone of Syed Rizwan Farook, one of the San Bernardino shooters. The court ordered Apple to create a new, special version of its iOS operating system that bypasses several built-in security features. The court also ordered Apple to sign the custom version of the software. Without this digital signature certifying the software's authenticity, the iPhone would refuse to run it.
Experts fear that the precedent the government is seeking in this case, the power to force Apple to sign code for the government, could allow it to force other technology companies to sign surveillance software and then push it to individual users' devices, using the automatic update mechanisms that regularly look for and download new software.
If consumers fear that the software updates they receive from technology companies might secretly contain surveillance software from the FBI, many of them are likely to disable those automatic updates. And even if you aren't worried about the FBI spying on you, if enough other people are, you will still face increased threats from hackers, identity thieves and foreign governments.
There are a lot of parallels between computer security and public health, and in many ways, software updates are like immunizations for our computers. Just as we want parents to get their children immunized, we want computers to receive regular software updates. Indeed, just as the decision by some parents not to vaccinate their children puts their entire community at risk, so too does the decision to turn off automatic updates harm not only the individual but other users and organizations, since hackers can use those vulnerable, infected computers to target others.
The trust that Americans have placed in software companies is far too important to risk destroying to make it easier for the government to spy. And the precedent the government is seeking in this case will not just apply to Apple but, in the age of the Internet of Things, to the TVs, thermostats and other smart devices with cameras and microphones we are inviting into our homes.
© 2016 The Washington Post