Practical Considerations of Right-to-Repair Legislation

Background

For some time there has been a growing movement among consumers who wish to repair their own devices in a cost-effective manner, motivated both to reduce their expenses and to reduce e-waste. This is becoming ever more difficult to achieve as devices reach higher levels of complexity and include more electronics and firmware. The difficulty is exacerbated when OEMs make the information, parts, and tools necessary for repairs available only to their own authorized repair centers, policies which have been described as predatory.

Fundamentally, much of the conflict appears to arise from the distinction between the distribution models of traditional goods and digital goods. Historically, if you buy an item like a car you are free to do with it what you like, including re-selling it, repairing it with original or after-market parts, and generally modifying it as you see fit. For digital goods like software this is typically not the case. Instead, the purchaser acquires only a license to use the software rather than any ownership stake that would permit them the same capabilities as a traditional purchase. The line between these cases has now become very blurred, as much of what we buy contains computers and software. So while you may own your modern car or smartphone, you almost certainly do not own the many millions of lines of code within it.

There have been several prominent cases at the forefront of the debates that are worth mentioning for additional context.

  • John Deere, a popular manufacturer of farming equipment, has been in a battle against farmers who want tools and software to be made available so that they can repair their own machines without expensive service calls.
  • Apple similarly makes authorized repairs available only at its own (often inconveniently located) stores and has repeatedly lobbied against right-to-repair legislation.
  • McDonald’s, too, is embroiled in a scandal over preventing franchisees from using third-party diagnostic tools to keep their ice cream machines in working order, an expensive problem that leaves 5-16% of all McDonald’s restaurants unable to sell ice cream at any given time.

Owner Perspective

The benefit to owners is somewhat obvious. Being able to repair their devices allows them to save money, to learn how the technology works, to innovate new uses for things, and gives them freedom of choice. The cost of repairs at name-brand repair shops has always been much higher than at independent shops. This is a long-established truth, most evident in the auto repair industry, and we see it mirrored in other industries, as anyone who has had to fix a broken iPhone can attest. The ability to modify a system has been a crucial part of the story of innovation: so many of our modern gadgets owe their existence to improvements built upon previous generations of technology.

OEM Perspective

Depending on the industry and the device, repair services can account for a significant portion of an OEM’s revenue. While Apple claims to actually lose money on iPhone repairs, about half of the US automotive market’s $1.2 trillion revenue is generated from aftermarket part sales. Other industries and other OEMs will fall somewhere on this profitability spectrum. But while profit can be an important factor in restricting repair capabilities, it is not the only one.

Most OEMs correctly take their responsibility for the security of their users very seriously. This is not only good business, it’s a legal requirement, and failing to protect the security of user devices can bring very serious penalties. To that end, OEMs implement a plethora of security features in their devices that are intended to lock out would-be attackers. Attackers come in many varieties, from the opportunistic, who scan the Internet for vulnerable systems, to the very targeted, who may attempt to compromise a device physically. In some cases, the user themselves is considered an attacker, as is the case for most entertainment devices. Here, the content providers are a stakeholder in the device security story and contractually require that their assets be protected from the users.

Finally, modern devices are rarely built and put to market without an extensive array of technology suppliers. Frequently an OEM has supply arrangements that are bound by highly restrictive contract terms which may prevent the public disclosure of the vendor’s intellectual property. This can include source code, reference schematics, datasheets, software tools, and other information that may be necessary for the effective repair of some types of defects. The OEM may simply not have the legal rights to release the required information.

US Federal Legislation

In the US, and around the world, various governments have proposed (largely unsuccessful) legislation to address the repair controversy, and several US states have attempted unsuccessfully to introduce similar legislation in the past. Most recently, a US presidential executive order was issued tasking the FTC with enacting new rules for OEMs in this regard. At about the same time, a new bill was introduced in Congress with the same purpose, the text of which is now available as the Fair Repair Act. Over the coming days and weeks a significant amount of ink will be given to interpreting the recently published text. The key highlights of the proposed legislation include the following:

  • (Section 2.a) Electronic OEMs must make documentation, parts, and tools available for owners and independent repair providers. This includes making firmware available for public download. 
  • (Section 5.3) Also included is “reporting output” which is not defined in the text. We assume this means logs or similar, which may contain sensitive information.
  • (Section 5.5) The OEM must make these materials available under timely and reasonable terms. Reasonable terms must not include substantial obligation or restrictions.
  • (Section 3) Enforcement will be delegated to the FTC, and any FTC actions and intervention will take priority over possible actions by lower levels of government.
  • (Section 4.1) Security related functions are explicitly not excluded.
  • (Section 4.2) If trade secrets need to be divulged to comply, then so be it.
  • The proposal explicitly does not apply to:
    • (Section 4.4) motor vehicles (a term not defined in the text, but from other documents we conclude that tractors and farm machinery ARE covered by the rules)
    • (Section 4.5) medical devices (already subject to FDA definitions and regulation).
  • (Section 5.4, 5.6) Embedded software is defined as “programmable instructions”, but this itself is vague and undefined and warrants further discussion.
  • (Section 5.12) Tools may include software.
  • (Section 6) If passed, this legislation takes effect in 60 days.

Implementation Thoughts

Device security is a balancing act. There are often trade-offs and compromises with usability, performance, cost, and of course repairability. Here we discuss some specific implications of the proposed legislation and how an OEM might alter their designs to comply.

Minimum Repairable Unit

The biggest question that remains unanswered in the legislation text is what the minimum repairable unit should be. Swapping out an entire ECU module on a tractor is a straightforward repair for pretty much anyone with the right screwdriver. But what about deeper levels of granularity? Can single components within the ECU be replaced? What about fixes deep within a semiconductor device? What about software bugs? Obviously all of this can be fixed by owners with the right tools, skills, and instructions, but how deep into the technology stack will the legislation apply? In the extreme, consider that a modern CPU can contain thousands of internal computing elements, all of which contain firmware that may theoretically contain bugs to be repaired. Will the legislation require releasing the details of every transistor in the chip? This uncertainty is likely driving a lot of the resistance we see from the OEMs.

A layman’s interpretation of the legislation is that the minimum repairable unit is below the PCB level. Schematics are explicitly included in the list of information OEMs must provide, and so one would assume that users may be permitted to diagnose which component on the PCB has failed and replace it. For now it seems that the OEMs are left to interpret where they will draw the line for what is and is not repairable, and security must now be part of that discussion more than ever.

Security Mechanisms

There are certainly security features within devices that are intended to prevent attackers from compromising the device, its data, or its users. Each security mechanism that could be an obstacle for repair needs to be evaluated on a case-by-case basis to determine the best course of action for compliance.

Section 4.1 is interesting in its vagueness. One would assume that this is intended to address cases where passwords or authentication tokens are needed in order to conduct the repairs and bring the device back to a functional state. But additionally we should assume it is intended to address cases where security functionality designed to protect the device from attackers may prevent certain repairs, as exemplified by the iPhone button replacement situation from 2016 that caused devices repaired with third-party buttons to stop functioning.

As one specific example, secure devices will often use a secure device identity that is represented as a combination of the components within the device. Information from each security-impacting component is collected and cryptographically combined to create a value that represents the sum of all the parts. This value is cryptographically signed by the OEM (using a signing server or signing oracle) at manufacturing time to ensure that it cannot be tampered with by an attacker seeking to subvert the security of one or more components that may be vital to the security of the overall device. This signed value is then used for things like device identity, encryption of secrets, and other foundational security functionality. Any material change in the makeup of the security-related components of the device will invalidate this value or its signature, and thus be detectable (most likely manifesting as an early boot failure). Allowing authorized replacement of a component in such a system requires that the original pairing operation be performed again, along with the signing step. Because the signature generation is a security-sensitive operation, only authorized users would typically be granted such permissions, in order to prevent abuse such as attacks on user devices, the creation of counterfeits, and the laundering of stolen devices.
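To make the pairing mechanism concrete, here is a minimal sketch of how such a scheme might work. The component names, record format, and use of the Python cryptography library’s Ed25519 primitives are purely illustrative assumptions; real schemes are OEM-specific and anchored in hardware such as a secure element.

```python
# Sketch: component "pairing" record signed by an OEM signing oracle.
# All names and the record format are hypothetical.
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey


def pairing_digest(component_ids: dict) -> bytes:
    """Combine per-component identifiers into one value representing
    the sum of all security-relevant parts."""
    h = hashlib.sha256()
    for name in sorted(component_ids):           # stable ordering
        h.update(name.encode())
        h.update(component_ids[name].encode())
    return h.digest()


# --- at manufacturing (or authorized repair) time, inside the OEM ---
oem_key = Ed25519PrivateKey.generate()           # held by the signing oracle
components = {"soc": "SN-1234", "secure_element": "SE-9876", "modem": "MD-5555"}
record = pairing_digest(components)
signature = oem_key.sign(record)                 # tamper-evident pairing record


# --- at every boot, on the device ---
def verify_pairing(current_components, record, signature, oem_public_key) -> bool:
    if pairing_digest(current_components) != record:
        return False                             # a component was swapped
    try:
        oem_public_key.verify(signature, record) # record itself unmodified?
    except InvalidSignature:
        return False
    return True


assert verify_pairing(components, record, signature, oem_key.public_key())
```

In this model, replacing any listed component changes the digest, so the swap is detected at boot; restoring function requires re-running the pairing and obtaining a fresh signature from the OEM’s signing infrastructure, which is exactly the step at issue for repair access.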

Under the proposed rules, such a signing oracle would need to be made available to owners and independent repair operators, which may then enable the very attacks that the system was designed to prevent. Section 4.1 clearly covers such functionality, which may result in a significant degradation of device security if implemented poorly.

A reasonable (but unfortunately uncommon) approach here is to perform a two stage authentication, where both the OEM and the owner are required to authorize the bypass of the security mechanism. This allows neither party to bypass without the other’s approval. The caveat here is that as an industry we have yet to find a reliable way to support owner authentication that does not allow (or even encourage) poor security practices such as default or weak passwords.
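A minimal sketch of what such dual authorization could look like follows, again using Ed25519 signatures from the Python cryptography library for illustration; the request format and the key-distribution details are hypothetical assumptions and are omitted.

```python
# Sketch: dual authorization for a security-mechanism bypass. The bypass
# request must carry valid signatures from BOTH the OEM and the current
# owner before the device will honor it.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)


def authorize_bypass(request: bytes,
                     oem_sig: bytes, oem_pub: Ed25519PublicKey,
                     owner_sig: bytes, owner_pub: Ed25519PublicKey) -> bool:
    """Neither party alone can approve the bypass."""
    for sig, pub in ((oem_sig, oem_pub), (owner_sig, owner_pub)):
        try:
            pub.verify(sig, request)
        except InvalidSignature:
            return False
    return True


# Example: both parties sign the same repair request out of band.
oem_key, owner_key = Ed25519PrivateKey.generate(), Ed25519PrivateKey.generate()
request = b"allow-component-swap:device-42:secure_element"
assert authorize_bypass(request,
                        oem_key.sign(request), oem_key.public_key(),
                        owner_key.sign(request), owner_key.public_key())
```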

Secure Boot

Software and firmware are listed in the proposed legislation, but described simply as “programmable instructions”. It is unclear whether this affects all programmable instructions within the device, all the way down to the many embedded ROMs within the silicon, or whether the line will be drawn at the firmware (which is specifically mentioned). Furthermore, it is unclear whether the intent is to allow the owner source code access so that they may fix software bugs, or merely binary images so they can reflash devices with the original OEM-provided code. The latter is relatively common already, as any mature and responsible OEM is already making ongoing firmware updates available to provide security patches for connected devices. The former, on the other hand, comes with additional complications. Most firmware integrity measures, such as secure boot, rely on cryptographic signatures applied to the firmware image. Verification of the signature at boot time prevents attackers from persistently compromising the device. Such features have been seen for more than two decades in smartphones, and are now considered a table-stakes defense for any modern connected device. The complication here is that the cryptographic signature is only useful if the private key remains secret. If owners are expected to be able to modify the firmware then the key will need to be shared, revealing it to attackers and thus eliminating this important security defense entirely. It is unclear if Section 4.1 (security functions) and 4.2 (trade secrets) of the proposed legislation are intended to cover this particular case, but a conservative reading should probably assume they do. Two possible implementation solutions to this are:

  1. Provide unique signing keys for each individual device, and an authorization system that permits only the current owner of a system to access or use these private keys. This is a level of infrastructure burden few OEMs today are likely prepared for.
  2. Allow the owner to provide their own signing key. Emerging proposals exist (OCP, IBM) in some domains to allow the cryptographic transfer of ownership by replacing the signing key in a system with one of the owner’s choosing. Hardware support for this is rare, but does exist in some components. Unofficial partial support in Android also exists.

Neither solution above is commonly implemented today, and both would require extensive changes deep within the system, requiring new hardware and firmware to be designed and tested. Bound by dependencies in current hardware and semiconductor designs, it is doubtful that such changes could be rolled out within the proposed 60-day implementation deadline.
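As a rough illustration of the second option above, the boot logic might verify firmware against whichever public key is currently enrolled, with enrolment of an owner key being a one-time, explicitly authorized operation. The following is a simplified Python model of that verification flow only; all names are hypothetical assumptions, and in a real device this logic would live in boot ROM with the key anchored in one-time-programmable fuses.

```python
# Sketch: secure-boot verification against an enrolled key. The "fuse bank"
# here is just a variable; in hardware it would be one-time-programmable
# storage, and enrolment would require physical presence / owner auth.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

enrolled_pubkey = None                 # stands in for an OTP fuse bank


def enroll_owner_key(public_key):
    """One-way operation: transfer ownership by enrolling a new root key."""
    global enrolled_pubkey
    if enrolled_pubkey is not None:
        raise RuntimeError("a verification key is already enrolled")
    enrolled_pubkey = public_key


def boot(firmware_image: bytes, signature: bytes) -> None:
    """Refuse to run any image that the enrolled key did not sign."""
    try:
        enrolled_pubkey.verify(signature, firmware_image)
    except InvalidSignature:
        raise SystemExit("boot halted: untrusted firmware")
    print("firmware verified, booting")


# The owner enrols their own key, then signs their own (or the OEM's) image.
owner_key = Ed25519PrivateKey.generate()
enroll_owner_key(owner_key.public_key())
image = b"\x7fELF...owner-built firmware"
boot(image, owner_key.sign(image))
```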

Authentication

The difficulty for OEMs here is best illustrated by a concrete example: All phones contain a unique identifier that allows the network to address them individually. For phones on 3GPP networks this is the International Mobile Equipment Identity (IMEI). The IMEI is programmed by the OEM during manufacturing, and is typically allocated on a per network carrier basis. Repurposing phones from one carrier to another (during repair, or manufacturing rework) may require replacing the IMEI, and such functionality is therefore a practical necessity that OEMs implement for their own internal use. But because it is illegal to alter the IMEI in some jurisdictions, this repair functionality is tightly restricted to only OEM authorized individuals. 

There are many examples of similar functionality where through regulation, contractual obligations, liability, safety, security, etc, certain privileged functionality cannot be exposed to users or owners. From this standpoint, the proposed legislation will necessarily create multiple tiers of authorized repair. Some OEMs implement this sort of granular authentication capability already, separating permissions for internal OEM development, internal manufacturing, external repair, and owner capabilities, but this is not a common feature. Implementing the core firmware functionality and the infrastructure necessary to support it is non-trivial, and unlikely to meet the proposed 60 day implementation deadline.
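One way to model such tiers is a simple privilege ordering, where each sensitive operation declares the minimum tier allowed to invoke it. The sketch below is purely illustrative; the tier names and operations are hypothetical, and a real implementation would bind the tier to an authenticated credential (for example, a signed token) rather than a local value.

```python
# Sketch: tiered repair permissions. The tiers and operation names are
# hypothetical examples only.
from enum import IntEnum


class Tier(IntEnum):
    OWNER = 1
    INDEPENDENT_REPAIR = 2
    OEM_MANUFACTURING = 3
    OEM_DEVELOPMENT = 4


# Minimum tier required for each privileged operation.
REQUIRED_TIER = {
    "reflash_official_firmware": Tier.OWNER,
    "replace_paired_component": Tier.INDEPENDENT_REPAIR,
    "rewrite_imei": Tier.OEM_MANUFACTURING,      # legally restricted in places
    "unlock_debug_interfaces": Tier.OEM_DEVELOPMENT,
}


def is_permitted(operation: str, tier: Tier) -> bool:
    """Allow an operation only if the caller's tier meets the minimum."""
    return tier >= REQUIRED_TIER[operation]


assert is_permitted("replace_paired_component", Tier.INDEPENDENT_REPAIR)
assert not is_permitted("rewrite_imei", Tier.INDEPENDENT_REPAIR)
```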

End of Life

End of life is a challenge, as recent public cases highlight. A vulnerability in a Western Digital product is a case study of what happens when internet-connected products continue to be used past EOL. SonicWall had a similar incident recently as well.

Attacker techniques and tools improve over time, and as software ages, ever more vulnerabilities will be discovered in it. The standard practice is therefore to continually release updated software and firmware versions, preferably at a regular cadence. Automatic updates remove the need for owners to be involved in this vital (but tedious) maintenance, and are thus becoming a common feature in many devices.

In the WD case, the company had declared the product model EOL in 2015 and has provided no firmware updates to patch known vulnerabilities since that time. There were multiple critical vulnerabilities, including remote root command injection and a remote unauthenticated factory reset function. The latter is being actively exploited in the wild to destroy customer data, leaving WD in an uncomfortable position.

What happens when the product is no longer commercially viable for the OEM to support? It may still have decades of useful life, and thus support is clearly needed. For consumer devices the most common relationship model is that the customer pays a single up front cost, and the vendor provides binary-only firmware updates for free, for a time. Under this model, obviously it’s impractical for a company to support products in the field forever, and the customer is incapable of providing their own patches. This leaves a few options:

  1. Offer customers a support plan, such as an annual subscription fee that funds the ongoing maintenance of the product and its firmware. Customers who depend on the product may be willing to pay, and this support concept is already well established for enterprise products.
  2. Disable remote features upon reaching a clearly communicated EOL date. While such an attack surface reduction is responsible from the standpoint of the safety of the internet as a whole, it is not likely to thrill individual customers who may be reliant on that functionality. This path was taken by Sonos and caused significant unrest among owners.
  3. Require owners and users to explicitly accept the risk of running obsolete technology in an unsafe, unmaintained fashion. There is an obvious difficulty in properly communicating this to owners in a way that lets them make informed decisions (i.e. not all consumers are equally literate with technology).
  4. Turn it over to the community to allow maintenance to continue. Many companies open source their products when they are no longer commercially viable, which allows owners and hobbyists to continue the maintenance. This may require releasing firmware signing keys, and may require permissive licenses for any third-party components. This assumes, of course, that the OEM still has all these artifacts, as maintaining digital information for decades poses its own challenges; escrow solutions exist that can help here.

Regardless of the chosen solution, clear communication to the owners of the expected EOL date of the product, and in particular the security expiration date, is a must.

Timelines

The proposed 60 day implementation is significantly out of sync with typical product development timelines. For most electronic devices, the typical product development cycle can take 6-12 months, and this assumes relatively simple iteration on previous designs. For products that contain more fundamental improvements, it can take years. For complex semiconductor products, development cycles can easily be 10 years or more. 

This discrepancy makes compliance on new products very challenging, considering the foundational nature of the changes that may be required to comply without adversely affecting the security of users (by, say, unintentionally releasing dangerous tools). Even conceptually simple tasks, such as scrubbing firmware log statements to remove sensitive information, can be quite time consuming.

Moreover, nothing in the proposal indicates that this is to be applicable to new products only. Implementation on existing products is assumed, and this may require even more drastic actions by the OEM (such as the severely unsafe option of releasing code signing keys to both owners and attackers).

The proposed timeline is therefore expected to generate significant pushback from the OEMs. A better approach would have been to phase in the changes over a number of years, and grandfather existing products which may be uneconomical or technically impossible to support.

Closing Thoughts

It is abundantly clear that the right to repair movement is here to stay. Having visible support from the current US federal administration gives it some real momentum that is unwise to ignore. We recommend that OEMs begin working on solutions to comply sooner rather than later, because even if the current proposal does not pass into law, there will be others, and eventually compliance in one form or another, in some jurisdiction, is likely to be required. Having a low-friction compliance plan will become ever more important in the years to come.

