EMERGING NEWS 

Apple and Google release phone technology to notify users of coronavirus exposure

We need to be deeply sceptical of any company, political party or scientist who claims to have the truth, because such claims are usually unreliable

Apple and Google have just released their versions of Apple’s Safe Access app. That app works pretty well: it wakes you up in the morning and pre-empts all your settings, so you can’t use an iPhone to call 911 in an emergency. It also helps you shut down features that may exacerbate health problems, such as Facebook’s notifications for low blood pressure, which could interfere with prescribed medication.

The trouble is, these alerts aren’t enough. Most of us know that not getting enough sleep is bad for us. But sleep deprivation is also linked to a variety of adverse health outcomes, including respiratory illness and death from cardiovascular disease.

We are already doing far too little to protect ourselves against the pandemic that is sleep deprivation. Yet your phone can guard against it better than any other machine or device.

Experts such as Harvard sleep researcher Nicholas Christakis talk about this every day: we should not be waking ourselves up unnecessarily. These experts should also be widely disseminating information about how best to avoid the consequences of sleep loss. Ideally, the right information would be more widely available, and less complex, than the alerts Apple and Google released yesterday.

If the authorities required devices to alert us whenever a sensor near the nose detected changes in our breathing – so that we didn’t wake ourselves up – we would get the information we need. The industrial manufacturing process behind all our wearables is powered by heat, which in turn generates vibration energy. This is the energy that tells your phone that you are breathing irregularly, and that you should take care not to wake yourself up.
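
As a thought experiment only, here is a minimal sketch of the kind of threshold check such an alert could rest on. Everything in it is hypothetical: no current phone operating system exposes a BreathReading or BreathingMonitor type, and the breaths-per-minute bounds are placeholders, not clinically validated values.

```swift
import Foundation

// Hypothetical reading from a breathing sensor; not a real iOS or Android API.
struct BreathReading {
    let breathsPerMinute: Double
    let timestamp: Date
}

// A minimal monitor that flags irregular breathing when the rate drifts outside
// a configurable band. The bounds are placeholders, not clinical values.
struct BreathingMonitor {
    let lowerBound: Double
    let upperBound: Double

    func shouldAlert(for reading: BreathReading) -> Bool {
        return reading.breathsPerMinute < lowerBound ||
               reading.breathsPerMinute > upperBound
    }
}

let monitor = BreathingMonitor(lowerBound: 10, upperBound: 24)
let reading = BreathReading(breathsPerMinute: 28, timestamp: Date())

if monitor.shouldAlert(for: reading) {
    // On a real device this would schedule a local notification rather than print.
    print("Irregular breathing detected: \(reading.breathsPerMinute) breaths/min")
}
```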

Imagine how easily this could be done without any new laws, simply by adding the words “scraping” and “vibration”. Software would be designed so that if any sensor near your nose touches the metal casing, you shouldn’t pick up your phone. That sounds simple, but it isn’t; many of the sensors in your smartphone are temperature sensors, for example, and they won’t be able to detect friction on the casing. But that is easy to fix, by making them vibration-sensitive instead.
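
Purely to illustrate that filtering step, here is a short sketch in the same spirit. The CasingSensor type, the amplitude field and the 0.5 threshold are all invented for the example and do not correspond to any real smartphone sensor API.

```swift
import Foundation

// Hypothetical description of an on-device sensor; not a real platform API.
enum SensorKind {
    case temperature
    case vibration
}

struct CasingSensor {
    let id: String
    let kind: SensorKind
    // Relative amplitude of the last vibration event, if any (placeholder units).
    let lastVibrationAmplitude: Double?
}

// Only vibration-capable sensors can register friction against the casing,
// so temperature sensors are filtered out before the contact check runs.
func sensorsEligibleForContactCheck(_ sensors: [CasingSensor]) -> [CasingSensor] {
    return sensors.filter { $0.kind == .vibration }
}

// Placeholder threshold: treat any amplitude above it as contact with the casing.
func casingWasTouched(_ sensors: [CasingSensor], threshold: Double = 0.5) -> Bool {
    return sensorsEligibleForContactCheck(sensors).contains { sensor in
        (sensor.lastVibrationAmplitude ?? 0) > threshold
    }
}

let sensors = [
    CasingSensor(id: "temp-1", kind: .temperature, lastVibrationAmplitude: nil),
    CasingSensor(id: "vib-1", kind: .vibration, lastVibrationAmplitude: 0.8),
]

if casingWasTouched(sensors) {
    print("Casing contact detected – hold off on picking up the phone.")
}
```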

Now, should we ask for that law? That’s not a sure thing, given that a law mandating vibration-sensitive sensors would probably also require expensive and undesirable changes to the guts of your phone. And, of course, this legislation could be reversed by the next president. But to be safe, we should always be watching for loopholes that could be exploited.

There is no point in applying a stringent safety measure to protect against a harm that doesn’t exist, however well that measure works. Society has both a legal and a moral duty to take responsibility for its actions when the risk is low, while minimising its liability when the risk is high. That’s not as hard as it sounds. But it doesn’t mean we can always trust corporate or governmental assurances, particularly in the healthcare field.

The right person at Google should have been able to use the faint pulse on the screen – and some accurate hand-waving – to tell that not one but two people had fallen out of a window and into a swimming pool. The right person at Apple should have been able to read that alert – which reminded everyone to stop drinking that water – and to see the message from someone who had already gotten out of the pool and spotted the person on the left floating face down.

We should refuse to accept official promises that restrict our freedom – and leave us vulnerable – in order to encourage companies to do the work of government. That means withholding the information that they can provide, including in encrypted form. We know that our biggest sources of information – big tech companies – are bound by little more than their own personalities. And they have a long record of trying to hide information that they could disclose. We need to be deeply sceptical of any company, political party or scientist who claims to have the truth, because such claims are usually unreliable.

So, yeah: next time you see that pulsing icon somewhere on your phone, you can bet that it is meant to wake you up, no matter how much harm that entails.
