Michigan Engineering News

An artist's rendering of a bomb

A matter of time

How the Internet of Things infiltrated one home, and what it could signal about the future of privacy and security.

If I’d been home, this would have been obvious to me. But I wasn’t, and it was the middle of 2014’s Midwest “polar vortex.” I opened up my thermostat’s smartphone app and saw 32°F in big white numerals. Cursing, I phoned the man who sent the email.

He wasn’t a friend or neighbor. Rather, he was a heating and cooling dealer who had installed our furnace just a few months before, and he reached my house well before I did. It turned out that the unusual, thigh-high snow in our backyard had drifted over the intake and exhaust pipes of our new high-efficiency furnace.

I didn’t have a key hidden anywhere, but he didn’t need one. After he had shoveled the snow away from the pipes, I turned the furnace off and back on again through the app. Once I’d given him our Wi-Fi password, he checked that the furnace was indeed running and made his way home. The house was well on its way to a habitable temperature by the time I arrived. No frozen pipes.

Okay, so I was only about two hours away by the time the house got down to the freezing point, so I probably didn’t need the emergency service. But if that had happened the day before, when I had an 8,000-mile journey ahead of me? Well, I could have counted myself extremely lucky that my furnace was sending out distress signals to the installer (and theoretically me too, but it looks like those went directly to the spam folder).

Still, if the installer knows what’s happening with our furnace, then who else does – and what does it mean for the expanding world of smart devices?

Read a few privacy policies and it quickly becomes clear that they are (no surprise) typically geared toward protecting the company rather than providing assurances to the consumer. Some describe targeted advertising schemes that users are automatically opted into, allowing the company to share the user’s personal information with marketers.

The policy for my thermostat washed the manufacturer’s hands of any duty to protect my security, acknowledging that home networks are typically not secure. As for privacy, it did at least promise not to sell my data to anyone. And the reason the installer couldn’t reset the thermostat? He and the manufacturer have view-only access – theoretically at least, they can’t change the settings.

With Google’s data-hungry business model, you might expect that the company’s smart home brand Nest would be feeding user information to the search giant. However, Nest was conscientious about customer data before the Google takeover, and it promises to stay that way. In fact, Nest says its data stays on dedicated servers – it doesn’t rub shoulders with the rest of Google’s information holdings.

So, will the smart device companies of the future respect our privacy or treat our data however they please? Hard to say. It could even be that the consumers, rather than the companies, eventually own user data.

“Imagine using a table saw in a shop class where only 27 percent of the students take safety training. Security education is playing catch-up.”

Kevin Fu, associate professor of computer science and engineering

“We’re sort of at a tipping point for what the Internet of Things is,” said Erik Hofer, chief information officer and clinical assistant professor of information at U-M’s School of Information. “We’re challenged to appropriately balance the need to protect individual privacy with the huge potential that exists for innovation.”

Prabal Dutta, a Morris Wellman Faculty Development Associate Professor of Computer Science and Engineering at U-M, agrees that connected devices will soon be everywhere in our everyday lives. “This is even more Orwellian than Orwell predicted. It’s incredibly invasive. Then you layer on the other things – NSA surveillance,” he said. “Really, privacy is dead.”

And if privacy is dead, security is in the ICU. “It’s the wild west,” said Kevin Fu, an associate professor of computer science and engineering at U-M. “My students find that many Internet of Things products in shiny boxes with slick marketing merely pay lip service to security and privacy.

“But we’re not saints either. Over 1,500 U-M students take our programming course each year, but only 400 take our security class. Imagine using a table saw in a shop class where only 27 percent of the students take safety training. Security education is playing catch-up.”

Vulnerabilities are already compromising even modestly connected homes, as a PhD student demonstrated when he investigated my home network. Fu predicts that in ten years, we’ll be cleaning up the mess we’re making today, closing all the security holes that opened in this explosion of connected devices. Because the Internet of Things isn’t the future. It describes the Internet today.

And let’s face it – we haven’t really thought it through. At least, I hadn’t when I said, “A thermostat we can set while we’re away? Sounds good!”

Rise of the machines

Originally, the phrase “Internet of Things” referred to the point when “things” connected to the Internet, such as sensors, alarm systems and automated devices, outnumbered the people connecting through computers, tablets and smartphones. People are thought to have become the minority sometime in the last five years. Now, the phrase is often used to describe the connected things themselves.

These devices run the gamut from activity trackers to smart ovens to glucose sensors and insulin pumps. A century ago, we were wiring our homes with electricity. Now we’re computerizing them with automated light bulbs and thermostats connected to motion sensors, deadbolts that can be unlocked remotely, refrigerators that keep track of our food, washers that know when to order more detergent and security cameras we can view from anywhere.

As we incorporate these gadgets into our lives, we should be asking questions. What information are we giving away? What can be done with it? Are the companies taking more than they ought to? Are the services enough of a return for what we hand over?

And on the security side, is this data adequately protected, and how much more vulnerable to hacking do these new devices leave our home networks?

A home automated to the full extent available today may be able to tell when you get up, when you go to bed, how many people are in your house at a given time, what rooms they are in, what your diet is like, how often people in your household bathe or flush toilets, how often you wash your clothes or vacuum the floor, how much time you spend in front of the television and what you watch. Add that to the data collected by smartphone apps and activity trackers.

Invasive as that may seem, we’re quite accustomed to companies having access to our conversations through social networks, email, text messages and call logs. They stick cookies into our browsers to track where we go online and only recently have they begun to declare this when you arrive at the website (after the European Union passed regulations requiring that web users be notified about tracking).

Although a recent survey from the Pew Research Center shows that Americans claim to care deeply about privacy, we’re not very good at acting on it. We tend not to clear the caches on our web browsers very often or fork over a little extra cash for encrypted services.

So, honestly, would the level of data collection I just described motivate you to avoid linking up your life, or would it just give you a vague sense of unease as you enjoyed the convenience of your connected world? And would your feelings change if you had more control over your data?

An artist's rendition of a lock made from text

Data utopia

Research from the Mobile Territorial Lab – a collaboration between the Massachusetts Institute of Technology (MIT) and the telecom companies Telecom Italia and Telefonica – suggests they would.

“I think that more people are worried about the company doing something bad than making sure that they themselves are empowered to access and do something good with their own data,” said Hofer.

We’ve come to suspect bad behavior from companies that collect data about us, but there may be another way: retaining ownership of our data. That’s the concept behind the New Deal on Data, proposed by Alex Pentland, the Toshiba Professor at MIT and director of the Mobile Territorial Lab.

“There’s a lot of stuff that’s really bogus right now,” he told Scott Berinato, a senior editor of the Harvard Business Review, in a Google Hangout. Specifically, he was talking about unwieldy terms and conditions documents and the demand that you agree to them in order to use a product that you have purchased.

Hofer would add the implicit consent clauses that allow service providers to change the terms and conditions without a new agreement with the consumers. And usually, those terms and conditions include the right to collect vaguely specified data about users, with equally vague statements about what the company will do with that information.

Pentland would like to change that, and the Mobile Territorial Lab has spent the last few years developing and testing a potential solution in the Italian cities of Trento and Livorno. Rather than allowing companies to collect and store data on their users, the users would log their own data in a central repository. They would be able to see all of the data that has been collected, delete it as they see fit and grant companies access to specific parts of the repository. For instance, a map app needs to know my precise location to give me good directions, but for my purposes, my whereabouts are none of Facebook’s beeswax.

And access to user data would be revocable. It might be helpful if my map app knows my most frequent destinations so that I can check for traffic delays before setting off, but all that information would be borrowed from my data store. Verifiable logs would show that the map company didn’t make its own copies of any of the information I provided, with legislation in place to enforce it.
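
To make the concept concrete, here is a toy sketch in Python of what such a personal data store might look like. The class and method names are my own inventions for illustration, not the Mobile Territorial Lab’s actual system.

    import time

    class PersonalDataStore:
        """Toy personal data store: the user owns the data and
        grants, audits, and revokes access per company and field."""

        def __init__(self):
            self.records = []    # e.g. {"field": "location", "value": ...}
            self.grants = {}     # company -> set of permitted fields
            self.audit_log = []  # who accessed what, and when

        def log(self, field, value):
            self.records.append({"field": field, "value": value,
                                 "ts": time.time()})

        def grant(self, company, fields):
            self.grants.setdefault(company, set()).update(fields)

        def revoke(self, company, field):
            self.grants.get(company, set()).discard(field)

        def read(self, company, field):
            if field not in self.grants.get(company, set()):
                raise PermissionError(f"{company} has no grant for {field}")
            self.audit_log.append((company, field, time.time()))
            return [r["value"] for r in self.records if r["field"] == field]

    store = PersonalDataStore()
    store.log("location", (42.28, -83.74))
    store.grant("map_app", {"location"})  # directions need my whereabouts
    store.read("map_app", "location")     # allowed, and logged
    store.revoke("map_app", "location")   # access is revocable
    # store.read("facebook", "location")  # would raise PermissionError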

“On one hand, it has the potential to make uses of the data store auditable and gives people a good sense of what data about them says,” said Hofer.

But he pointed out that these data stores would be great targets for attackers. It would mean trading the risk of hackers obtaining slices of our personal data from multiple, variably guarded sources for the risk of hackers intruding on a single, well-defended database and getting a full picture.

The data store would also be able to anonymize data before handing it over to a company. For instance, I’m willing to make my location data available on an anonymized basis for traffic purposes.

“Anonymization is difficult to guarantee,” Hofer warned. It wouldn’t be very difficult to figure out that the location data is mine, given that my home address and work address are matters of public record. So the Mobile Territorial Lab proposed that the data store could analyze the data itself, providing answers to the map app’s traffic queries without sending off detailed user data.
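
Extending the toy sketch above, the store could run the computation itself and release only the answer, never the underlying records. Again, an illustration of the concept rather than the Lab’s implementation.

    import time  # reused from the sketch above

    def answer_query(store, company, field, aggregate):
        """Run `aggregate` over the user's own records and return
        only the single summary value to the company."""
        values = [r["value"] for r in store.records if r["field"] == field]
        store.audit_log.append((company, f"aggregate({field})", time.time()))
        return aggregate(values)

    # e.g. the map service learns average speed, not where I went:
    # answer_query(store, "map_app", "speed_mph",
    #              lambda v: sum(v) / len(v))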

Pentland envisions that data management would be a lot like banking, perhaps even with a sort of interest from the businesses to which users lend their data. “If some company is making money off of it, maybe you can too. It’s bringing people to the table,” he told Berinato.

And if their 150-user study in Trento is any indication, users are willing to share more data when the purpose and benefits are transparent. While companies may balk at the idea of paying for some data, the more complete picture of who the users are and what they need from their devices or services could be worth it, Pentland said.

Dodging dystopia

Still, the idea of mass data collection makes Dutta uncomfortable. He is concerned about a future in which insurance companies put devices on vehicles (I suppose this is before they are self-driving) and set rates based on how you handle your car. Or activity monitors and connected scales could indicate how well you are taking care of your health.

It might begin as an incentive program – already, some auto insurance companies offer a lower rate if they can track your GPS information – but eventually those who choose not to be monitored may be assumed to be high risk.

“Then you look around, and I’m building these sensors,” said Dutta, speaking of his menagerie of web-enabled devices to monitor air quality, electricity usage, bathroom scale readings and more. “It’s going to happen one way or the other, and if we can influence some of these things, I think it’s going to be better.”

Dutta and collaborators at U-M, Stanford and the University of California, Berkeley are looking at ways to ensure that some user information never makes it to company servers. After all, a business can’t misuse data it doesn’t have.

For instance, a learning thermostat benefits from knowing when the door has been unlocked or locked, indicating entrances or exits. Yet instead of communicating directly, the data first goes to the lock’s server, is transferred to the thermostat’s server, and only then comes back to the thermostat itself, Dutta explained.

“There’s no reason why that trip to the cloud has to happen,” he said.
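
What could direct, local communication look like? Here is a minimal sketch in which a hypothetical lock announces events on the home network and the thermostat listens for them, with no cloud in the loop. The port number and message format are invented for the illustration.

    import json
    import socket

    PORT = 50000  # arbitrary port chosen for this sketch

    def lock_announce(event):
        # The lock broadcasts its event to every device on the LAN.
        msg = json.dumps({"device": "front_door_lock",
                          "event": event}).encode()
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(msg, ("255.255.255.255", PORT))
        s.close()

    def thermostat_listen():
        # The thermostat hears the announcement directly - no servers.
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.bind(("", PORT))
        while True:
            msg, _ = s.recvfrom(1024)
            if json.loads(msg)["event"] == "unlocked":
                print("Someone's home - start heating")

    # lock_announce("unlocked")   # runs on the lock
    # thermostat_listen()         # runs on the thermostat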

Or, in cases where the information does have to go to the cloud, say for interpretation of voice data, Dutta is looking at how to encrypt it so that the processing can be done on the encrypted data, and the correct answer comes out after decryption.
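
The machinery behind that idea is known as homomorphic encryption, and the real schemes are mathematically heavy. But a toy version conveys the shape of it: mask the readings with secret keys, let the server compute on the masked values, and unmask the answer at home. This is a sketch of the concept, emphatically not a secure cryptosystem.

    import secrets

    P = 2**61 - 1  # work modulo a large prime

    def encrypt(reading, key):
        return (reading + key) % P           # additive mask

    def decrypt(masked_sum, keys):
        return (masked_sum - sum(keys)) % P  # strip all masks at once

    # Device side: three sensor readings, each masked with a fresh key.
    readings = [68, 70, 71]
    keys = [secrets.randbelow(P) for _ in readings]
    ciphertexts = [encrypt(r, k) for r, k in zip(readings, keys)]

    # Cloud side: sums the ciphertexts without seeing any reading.
    masked_sum = sum(ciphertexts) % P

    # Back home: the correct answer comes out after decryption.
    assert decrypt(masked_sum, keys) == sum(readings)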

And then there’s the security of the devices themselves. Fu recounted stories of baby monitors hacked by people shouting obscenities, or security cameras that would show their footage to anyone who discovered their IP addresses.

He also mentioned the search engine Shodan, created to reveal devices that are connected to the web. It once uncovered a heating and ventilation system at a Google office in Australia. “If Google can’t get it right, good luck to the rest of us mortals,” said Fu.

In cars, an IP address without defenses can be downright dangerous, as Fiat Chrysler Automobiles learned the hard way when hackers took over a moving 2014 Jeep Cherokee earlier this year. Fiat Chrysler released a patch when the stunt was published in WIRED, but if malicious hackers had exploited the weakness, there might have been a case for negligence.

An artist's rendition of an eye made from text

An exercise in awareness

As a mortal recently awakened to the risks posed by connected devices, I called the customer service line for my Wi-Fi thermostat to ask about their security standards. The representative, who identified himself as Mark, said that while the signals were unprotected over Wi-Fi, “the Internet is encrypted.”

I was a bit puzzled by this, as according to my meager understanding, encryption occurs before the signal is sent out. Rather than giving me more detail, he repeated this point several times. I began to suspect that he knew some of the terms but didn’t really know what he was talking about (and I can say this with some authority, as I am often in this situation).

It seemed like a good time to get outside expertise. Michigan is home to master hacker Alex Halderman, a Morris Wellman Faculty Development Assistant Professor of Computer Science and Engineering, who is best known for fighting censorship and exposing weaknesses in electronic voting, traffic light systems, airport scanners and more. He did not have time to hack my thermostat.

However, Travis Finkenauer, a PhD student under Halderman’s tutelage, kindly took time out of his day to give it a go.

He quickly recognized that I did not actually want to be hacked, as that would perhaps break my stuff, but instead carried out a security assessment. Logged onto my home network, he started by sending out messages to any device that was listening.

Our Wi-Fi speaker, the media streaming device for our television, Wi-Fi printer, and various computers and phones answered.
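
Finkenauer didn’t share his exact tooling, but a common way to take that kind of roll call is an ARP sweep, sketched here with the Python library scapy. The subnet address is an assumption.

    from scapy.all import ARP, Ether, srp  # requires root privileges

    # Broadcast an ARP "who has this address?" probe across the whole
    # subnet; every live device answers with its IP and MAC address.
    # 192.168.1.0/24 is a typical home subnet.
    probe = Ether(dst="ff:ff:ff:ff:ff:ff") / ARP(pdst="192.168.1.0/24")
    answered, _ = srp(probe, timeout=3, verbose=False)

    for _, reply in answered:
        print(reply.psrc, reply.hwsrc)  # each device's IP and MAC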

“Your speaker doesn’t require any username or password,” he offered. “If I got you to open an email link while you were on your home network, I could own that speaker and use it to attack other devices on your network.”

A promising start. It never occurred to me that I should have a username and password on our speaker, or that a hacker could do anything worse than play “All About That Bass” on an endless loop. But among the many devices on my home network, the thermostat was slow to show itself. “Do you have a Fitbit?” Finkenauer asked, having spotted a Wi-Fi card of the model used by Nike.

“No, but our phones track how many steps we take,” I replied.

He turned back to his computer. Eventually, he explained that he could only see broadcast packets that go out to all the local devices. The thermostat was probably communicating only with the company server, so he could not see the traffic.

To intercept the signals, he set up his computer as a router. Minutes later, he said, “Oh, it received a weather update! Breezy, warm, partly sunny.”

No encryption, then.

“It’s just an HTTP connection – not HTTPS,” he added.

“Does this somehow get encrypted when it reaches the Internet?” I asked.

Finkenauer gave it enough thought to be polite. “No. It would be hard for anyone to pick up the individual packets, but the NSA could go into the backbone points (of the Internet) and read them.”

Or Comcast could read the packets. Or anyone clever enough to get into my home network.
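
The weather update was readable because it traveled as bare HTTP. Anyone positioned to see the packets could confirm that the way Finkenauer did, by capturing the traffic and watching the plaintext go by. A scapy sketch of the idea:

    from scapy.all import Raw, sniff  # requires root privileges

    def show_plaintext(packet):
        # Payloads on port 80 are unencrypted HTTP; print them as-is.
        if packet.haslayer(Raw):
            print(packet[Raw].load[:120])

    # Watch unencrypted web traffic crossing this machine.
    sniff(filter="tcp port 80", prn=show_plaintext, store=False)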

Finkenauer explained that if he got control of the kitchen speaker, he could tell the thermostat to route its communications through the speaker. Then he could intercept messages between the server and thermostat. He could probably change the settings by sending a fake response from the server, pretending that I had updated the thermostat settings from my phone. (But he didn’t try this as it could potentially upset my thermostat.)

“The big one is the firmware updates,” said Finkenauer, but unfortunately no firmware updates were available for him to watch through his network. “If the firmware updates aren’t encrypted, which seems likely, then I could spoof one of those and have the thermostat do whatever I want, including attacking other devices.”
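
The standard defense is for the manufacturer to sign every firmware image and for the device to refuse any update whose signature doesn’t verify against a public key stored in read-only memory. Here is a minimal sketch using the Python cryptography library; whether my thermostat does anything like this, I don’t know.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )

    # Manufacturer side: sign the firmware image with a private key.
    signing_key = Ed25519PrivateKey.generate()
    firmware = b"...firmware image bytes..."
    signature = signing_key.sign(firmware)

    # Device side: the matching public key ships with the hardware.
    public_key = signing_key.public_key()
    try:
        public_key.verify(signature, firmware)  # raises if tampered with
        print("Signature valid - install update")
    except InvalidSignature:
        print("Spoofed or corrupted firmware - reject")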

So I called customer service again and asked if the firmware updates were encrypted. Mark said yes. Since it was the same guy, I asked whether general communications between the thermostat and the server were encrypted. Yes. Which I know is not true.

Was he mistaken? Was he lying? I had nothing to gain from picking a fight, so I hung up. Another sign of the times, when neither consumers nor companies are wise to security.

So what can regular folk do?

Not a lot, said Fu. “Opting out doesn’t work because devices may be embedded in things you buy. You might get an Internet of Things device whether you like it or not,” he noted.

Still, it wouldn’t hurt to think about security before we buy our connected devices, rather than years after the fact (as I have done). Some companies encrypt their connections and servers, so even if someone gets access to the data, they can’t read it.

We can practice basic hygiene and make sure every device we connect has a username and password (won’t those be fun to keep track of). But our home networks won’t be well armored against attacks unless device makers get serious about security.

Toward a secure future

Companies know that the average home network isn’t secure, especially as it’s easier than ever to leave a device without a username and password. Yet they treat the home network as if it were secure.

“Once you’re sort of beyond the wall of the house, you can attack anything you want. There’s not a good reason for us to have essentially no security once you’re inside,” said Dutta.

His collaboration with Stanford and Berkeley is also working on ways to verify communications among devices while respecting the limited computing power of these tiny computers. For instance, if two devices are in the same room, they could use shared knowledge, like how long it’s been since the last time the light turned on. Then a remote hacker would have to control a device to send fake signals from it – mere impersonation wouldn’t work.
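
One way to picture the shared-knowledge approach: both devices fold the jointly observed event into the key of a message authentication code, so a remote attacker who never saw the light turn on can’t forge a valid message. A loose sketch of the idea, not the researchers’ actual protocol:

    import hashlib
    import hmac

    def authenticate(message: bytes, seconds_since_light_on: int) -> bytes:
        # Both devices witnessed the same event, so both can derive
        # the same key; a remote impersonator cannot.
        shared_secret = str(seconds_since_light_on).encode()
        key = hashlib.sha256(b"room-context:" + shared_secret).digest()
        return hmac.new(key, message, hashlib.sha256).digest()

    # Light switch tags its message; the thermostat checks the tag.
    tag = authenticate(b"lights_off", seconds_since_light_on=742)
    ok = hmac.compare_digest(tag, authenticate(b"lights_off", 742))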

Or, a trusted device on the network could carry out the processing needed for advanced encryption techniques.

But until better security is implemented, embedded computers must be treated with suspicion – particularly those with life-and-death responsibility, such as Wi-Fi-enabled patient monitors in hospitals. Fu and his former postdoctoral researchers Denis Foo Kune and Ben Ransford founded a startup, Virta Labs, to commercialize a power monitor for connected devices. If an embedded computer is compromised and doing nefarious deeds, its power consumption will be higher than expected during normal operation.
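
Virta Labs hasn’t published its exact methods, but the core intuition fits in a few lines: learn a device’s baseline power draw, then flag readings that stray too far above it. A deliberately simple sketch:

    from statistics import mean, stdev

    def find_anomalies(baseline_watts, live_watts, threshold=3.0):
        """Flag power readings more than `threshold` standard
        deviations above the device's normal consumption."""
        mu, sigma = mean(baseline_watts), stdev(baseline_watts)
        return [w for w in live_watts if w > mu + threshold * sigma]

    # A week of normal readings, then a spike - perhaps malware at work.
    normal = [4.8, 5.1, 5.0, 4.9, 5.2, 5.0, 4.9]
    print(find_anomalies(normal, [5.0, 5.1, 9.7]))  # [9.7]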

“If I got you to open an email link while you were on your home network, I could own that (device) and use it to attack other devices on your network.”

Travis Finkenauer, computer science and engineering graduate student

Dutta doesn’t think capitalism will be enough to bring built-in security to the Internet of Things – at least not before some kind of major breach.

“Here’s really the issue: privacy and security will not sell a gadget,” he said. “We can provide all the technologies, but ultimately, there’s a cost to adoption. And it’s not a significantly differentiating feature yet.”

He advises writing to Congress about the need for legislation. The Federal Trade Commission (FTC) is gearing up to propose regulations, but for now, it has only made recommendations.

In January 2015, the FTC released guidelines as part of a report about security and privacy in the Internet of Things. On the privacy side, it advised data minimization – not storing data unless it is necessary to provide the service and deleting data that is no longer needed. For instance, a learning thermostat may only need a few weeks’ worth of data to predict when you wake up, when you leave, when you return, and when you go to bed.
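
In code, that kind of data minimization can be as simple as a retention window: each new reading is stored, and anything older than the few weeks the predictions need is deleted. An illustrative sketch, not any vendor’s actual policy:

    from datetime import datetime, timedelta

    RETENTION = timedelta(weeks=3)  # a few weeks suffices for a routine

    def add_reading(history, timestamp, occupied):
        """Store the new reading, then delete anything past retention."""
        history.append((timestamp, occupied))
        cutoff = datetime.now() - RETENTION
        history[:] = [(t, o) for t, o in history if t >= cutoff]

    history = []
    add_reading(history, datetime.now(), occupied=True)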

Or companies could anonymize data after a certain period of time. (Although, as Hofer explained, just because information doesn’t have a name attached doesn’t mean that name can’t be discovered.)

On the security side, the recommendations were fairly fundamental: build security into the design from the beginning, train employees about security, consider monitoring security threats and providing patches, consider ways to exclude unauthorized users.

And if companies have to be told to do these things, it’s easy to see why Dutta calls security a ticking time bomb.

Media Contact

Kate McAlpine

Research News Editor
