The Internet of Things (IoT) refers
to the system of interrelated computing devices, machines, and objects that
have unique identifiers and the ability to transfer data over a network.
Included in this definition is everything from smart cars, to smart homes, to
fitness watches, to baby monitors. The number of these devices
connected to the Internet has grown significantly in recent years, reaching 25 billion
devices, and the FTC
estimates that this number will double to 50 billion devices by 2020.
Two of the major concerns with the IoT are the lack of
security and the privacy implications of all the data collected. First, because
of the ubiquity of the IoT, some argue that all
the vulnerabilities of the digital world have been introduced into our real
world. Examples of IoT devices getting hacked abound, with Sony’s PlayStation,
baby
monitors, cars,
and “Hello
Barbie” reportedly succumbing to attacks. Another recent example, although
attributed to an iOS glitch rather than a hack, is a British
smart thermostat system that had some customer’s thermostats stuck at 32 C
(about 90F).
These security vulnerabilities also raise privacy issues.
In the hack of TrendNet’s
live-camera feeds (used to monitor babies, patients in hospitals, offices,
banks, etc.), the hacker was able to post links to over 700 IP camera feeds on the
Internet. This exposed private areas and activities, including surveillance of
babies in their cribs, children playing, and daily household activities. But
beyond the privacy implications of hacks, there is also the issue of what data
is being collected, how it is being used, and what protective measures are in place.
One approach to minimizing privacy issues is to collect only the data related
to the published goal of the device. For example, if a person were wearing a
device to track the steps or miles they walked, under this approach there would
be no reason to collect their location data. Similarly, some argue that in the area of smart cars, or
“vehicle-to-vehicle communication,” it is only necessary to know that a car
is swerving towards you, not whose car it is. Limiting the collection of
this information would reduce the chances that the car manufacturer knows
exactly where you are at any point in time.
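To make this data-minimization approach concrete, here is a minimal Python sketch, assuming a hypothetical step-tracker payload and field names invented for illustration (this is not any vendor's actual API). Only the fields tied to the device's published purpose, counting steps, are kept before the reading leaves the device:

# Illustrative sketch of purpose-based data minimization (hypothetical field
# names, not a real tracker's API).
ALLOWED_FIELDS = {"device_id", "timestamp", "step_count"}  # published purpose only

def minimize(reading: dict) -> dict:
    """Drop any field (such as GPS location) not needed for step tracking."""
    return {k: v for k, v in reading.items() if k in ALLOWED_FIELDS}

raw_reading = {
    "device_id": "tracker-123",
    "timestamp": "2016-04-01T12:00:00Z",
    "step_count": 8421,
    "gps_lat": 40.7608,   # captured by the sensor, but unrelated to step counting
    "gps_lon": -111.8910, # so it is never transmitted or stored
}

print(minimize(raw_reading))
# {'device_id': 'tracker-123', 'timestamp': '2016-04-01T12:00:00Z', 'step_count': 8421}

A whitelist like this, rather than a blacklist, errs on the side of collecting less when new sensor fields appear.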
As to the question of whether or not Congress should
regulate the IoT, I agree with the FTC
report recommending that broad Congressional
legislation is not appropriate at this time. The FTC's rationale is that such
legislation would stifle innovation; to that I would add that, given the sheer
range of devices in the IoT, one-size-fits-all Congressional legislation would
be a poor fit. I would instead argue that security be left to each
industry, with the exception of the auto industry, as discussed below.
The one industry where I would argue legislation is appropriate
is the auto industry. There are significant safety
and privacy implications involved with a lack of security in “smart cars.”
The distinction I see is the potential for greater physical harm.
While both watches and cars may track location
data, if a car is hacked in the middle of a major freeway during rush hour
there are likely to be many more tangible injuries. The importance of securing
vulnerabilities in the auto industry has been recognized by Senators Markey and
Blumenthal, who introduced the Security
and Privacy in Your Car Act of 2015. This bill would require that all
vehicles manufactured for sale in the US be equipped with “reasonable measures
to protect against hacking attacks.” Separate from this legislative response, the
auto industry has been working on solutions to vulnerabilities in its cars.
USA Today reports that carmakers representing 98% of
automobiles on the market have joined a consortium that will enable the
sharing of information on cybersecurity measures without violating antitrust
laws. I would argue that there needs to be a baseline standard for cybersecurity
in cars, and that this baseline will likely come from federal
regulation (similar to how emissions and
safety standards/ratings are set by the federal government).
With the exception of the auto industry, I think this
problem should be left in the hands of each industry. As more attention
falls on the security of IoT devices, the incentive for companies to take
security and privacy seriously increases. Technology research
company Gartner lists “IoT Security” at the top of their list of “Top
10 IoT Technologies of 2017 and 2018.” Private companies such as Symantec and Cisco
are already marketing “IoT security products.” Companies such as Fitbit are
publicly addressing privacy concerns by disseminating information
detailing to users what data is being collected, how it is being used, how long
the data is saved, and how users may opt out of certain features. Fitbit is
just one example, but it is an encouraging recognition of the privacy interests
in the data being collected. The counterargument to this Fitbit example is
that no one actually goes to that webpage, and that the government should
therefore require Fitbit to put the privacy information on the package or the device itself.
That is not to say that in the absence of regulating the
entire IoT there is no role for the FTC. The FTC has already filed
complaints against TrendNet
and "tech giant" ASUSTeK
Computer, Inc. for advertising their products as "secure" when that
was not the case. Both cases settled. In the ASUS
case the FTC focused on the fact that ASUS did not address security flaws
in a timely manner and did not notify customers of security vulnerabilities. The
FTC has also provided
some tips for companies in the area
of IoT security. They emphasize starting with security, designing the product
through customers' eyes, making it easier for people to select the safer option
from the start, heeding security warnings, thinking through how you will let
consumers know about fixes, and learning lessons from other FTC cases.
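As a rough illustration of the "start with security" and "safer option from the start" points, the Python sketch below shows one way a manufacturer could ship secure defaults: a unique random password per unit, automatic updates on, and riskier features off until the owner turns them on. The class and parameter names are my own assumptions, not drawn from the FTC guidance or any real product.

# Minimal "secure by default" configuration sketch (invented names, not from
# the FTC guidance or any actual device).
import secrets
from dataclasses import dataclass, field

def random_password(length: int = 16) -> str:
    """Generate a unique per-device password instead of a shared factory default."""
    return secrets.token_urlsafe(length)

@dataclass
class DeviceConfig:
    admin_password: str = field(default_factory=random_password)  # unique per unit
    auto_update: bool = True        # security patches apply without user action
    remote_access: bool = False     # riskier feature is opt-in, not opt-out
    telemetry_opt_in: bool = False  # data sharing requires an explicit choice

config = DeviceConfig()
print(config.auto_update, config.remote_access)  # True False

The point is simply that the secure choice costs the consumer nothing at setup, which is the spirit of the "easier to select the safer option" tip.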
I agree with everything Laura said, and I think her stance on FTC regulation is based on sound reasoning. However, I also see merit in having the FTC regulate broadly, across all industries. First, there would be uniformity of regulations across technology areas. Second, it would ensure that an impartial body, with nothing to gain or lose in any particular industry, is looking out for consumers. There may be a risk of "capture" if the industries are left to self-regulate.
I, too, agree with much of what Laura has written, with a notable exception. With regard to the FTC's recommendation on IoT legislation, the Commission does recommend baseline federal privacy legislation, which it claims will help build consumer confidence in IoT devices, while noting the security risks. However, privacy and security are two different yet inextricably linked aspects of the same problem, such that compliance with one might account for a substantial step toward compliance with the other. Therefore, the recommendation not to legislate one aspect should not necessarily be extended to the other.
An example that could be emulated is the pairing of the HIPAA privacy rule and security rule. See http://1.usa.gov/1LtSkvw and http://1.usa.gov/21cWxec. Whereas the privacy rule is not prescriptive and takes a more general "do the right thing via policy" approach, the security rule is far more specific and lays out "required" and "addressable" standards to ensure security. The implementation of the "addressable" standards is left up to each regulated entity, based on reasonableness and appropriateness, typically determined through risk assessment and risk tolerance. Applying these principles to the regulation of IoT, standards like strong encryption of data at rest and in transit could be prescribed by an "IoT Security Rule," whereas other aspects, such as the type and amount of data collected by these devices and who may access that data, could be addressed by an "IoT Privacy Rule."
Although IoT consumer devices are relatively new, these concepts evolved from existing industrial technologies like SCADA, which are fairly mature. See http://bit.ly/1QrTiNe. Many of the security issues with IoT devices also exist in their "industrial" counterparts, and whereas those devices have been mostly left alone, there is now an Internet search engine that allows hackers to find insecure SCADA devices. See http://zd.net/24HtgwK. Since SCADA devices control things like power grids and other critical infrastructure, one might imagine that the industry had self-regulated its way to better security. However, that has not been the case, and there is no analogous basis to conclude that IoT security will improve through self-regulation by that industry.
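As a loose sketch of the kind of prescriptive standard a hypothetical "IoT Security Rule" might impose, the Python snippet below shows encrypting a reading at rest and setting a TLS policy for data in transit. The choice of libraries (the third-party cryptography package and the standard ssl module) and the specific settings are illustrative assumptions, not requirements drawn from HIPAA, the FTC, or any proposed rule.

# Illustrative only: one way "encrypt at rest and in transit" could look in code.
# Assumes the third-party "cryptography" package is installed.
import ssl
from cryptography.fernet import Fernet

# At rest: authenticated symmetric encryption before anything is written to storage.
key = Fernet.generate_key()  # on a real device this key would live in secure key storage
box = Fernet(key)
ciphertext = box.encrypt(b'{"step_count": 8421}')
assert box.decrypt(ciphertext) == b'{"step_count": 8421}'

# In transit: require certificate-verified TLS, refusing outdated protocol versions,
# for any connection that uploads readings.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2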
Great post. With respect to your discussion of the automotive industry, it should not be forgotten that the automobile consortium you reference has already adopted a set of "Privacy Principles" to govern autonomous vehicles (see http://www.autoalliance.org/?objectid=865F3AC0-68FD-11E4-866D000C296BA163 and video lecture 15), and several states, including Utah, have already passed legislation addressing privacy concerns with connected cars.
I agree with Laura and the FTC report that broad congressional legislation is not appropriate at this point. I feel that, along with stifling the tech industry, Congress would be playing a potentially endless game of catch-up. By the time Congress passes a law, the tech industry will have released the next advancement, making the most recent regulation outdated before it takes effect. Of course, this plays off the old cliché that the law moves too slowly to keep up with technology. I don't believe that all efforts are futile, but I do think that attempts to regulate the IoT will not be efficient at this time.
I worry about each regulated industry being in control of how it manages the IoT. Beyond the technological privacy reasons, my main concern is a social one.
In Managing Privacy in the Internet of Things, Usman Haque speaks about a common language, the "interoperability" of things. I think this brings up a fantastic point about the fallibility of human language. Will the technological language be that of the rich white man? So many terms are culturally defined. Will the language be that of a person in a foreign country? Where will this leave those who are not doing the programming? I have many concerns that a transhumanist society would leave the human aspect far behind.
In addition, I have concerns about the IoT being used to take freedom away from people beyond what the situation calls for. When the power lies not with the individual but with a machine, the nuances of life fail to be taken into consideration, even though technology is supposed to make handling those nuances easier. When a poor person's paycheck is late, or they own a small business and one month was particularly bad, will physical harm result from lack of payment? Will the power of technology go so far as to restrain someone, akin to debtor's prison or house arrest? The notion that an unpaid car loan could allow a company not just to repossess a person's car but to render the car inoperable is frightening. For instance, what if I were driving my children across the country in a snowstorm? There are so many questions that urge the future of humanity and technology to revisit old moral philosophy. Could we be in the next industrial revolution, where society as a whole benefits while many suffer a day-to-day decline? What rights would the holder of a mortgage have? What if your mortgage went unpaid and your house automatically locked you out? The potential is vast for the power of technology to be used in inhumane ways that are justified as being for the good of the whole.
Another fear I have is a concern over car emissions and registration. A lower-quality car might raise more privacy concerns simply because it lacks sophisticated, up-to-date technology. Are laws that would regulate privacy protection in cars not guaranteed to uplift the rich while making the poor more distraught? I would say so. Where one may think it is a great idea for companies to cooperate with one another, I fear it is only a way to punish the poor under the excuse of "positive incentives." Say a car's mileage automatically communicates with the government in support of a clean-air law: the government gives bonuses, or maybe the car company shares mileage data with another corporation. This leaves the question of what power these companies would have when communicating with each other on such intimate levels. What about the poor person who cannot afford to live close to their work? The profound social issues are endless. Because the Internet of Things tends to intertwine with our lives, making things more personal, the potential power in the hands of the few must be evaluated early on.
I think that as the IoT becomes more ubiquitous, there needs to be a greater push to increase the accountability of tech companies. For example, I would be in favor of modifying the duty of care imposed on tech companies to ensure that users' personal data is protected from hackers and other nefarious third parties. Although this might drive up costs, there needs to be real accountability beyond FTC investigations.
I also agree with Laura that further regulation is not appropriate at this time. Given that the IoT has the ability to make our lives more convenient and efficient, it would be unfortunate to see red tape and regulations stifle the benefits created by these innovations. On the other hand, it would also be unfortunate for privacy breaches to become more damaging and more widespread before Congress intervenes. That is why I would be in favor of legislation now that increases the duty of care for tech companies.