
NTT Develops Cache Random Function with Universities to Enhance CPU Security

Published August 16, 2023

News Summary

NTT, in partnership with two universities, developed SCARF, a cache random function to enhance CPU security and prevent cache attacks.



To eliminate the vulnerability caused by the timing differences the cache introduces when data is read from or written back to memory, NTT Corporation has developed a dedicated cache random function in collaboration with the Research Institute of Electrical Communication at Tohoku University and CASA (Cyber Security in the Age of Large-Scale Adversaries) at Ruhr University Bochum.

This research enables the development of highly secure CPUs that thwart cache attacks and the information leakage they cause.

NTT formulated design guidelines that articulate what kind of function is suitable for randomizing the cache index, and on that basis designed and proposed a Secure Cache Randomization Function (SCARF) dedicated to cache index randomization.

Key Points:

  • Designing SCARF, a concrete function for cache index randomization
  • Modeling the capabilities of attackers who mount cache attacks
  • Establishing a design theory that is both efficient and secure against the modeled attackers by using a tweakable block cipher

Context of the Study

Modern CPUs implement cache memory to mitigate the latency of transferring data between the CPU and main memory: recently used data is kept close to the CPU so that subsequent references are faster. However, the fact that data referenced once can be referenced more quickly the next time is also observable by attackers. Cache attacks exploit this timing difference to leak information and constitute a genuine vulnerability for which countermeasures are required. Contention-based cache attacks, which arise from cache contention between the attacker's program and the target program, are recognized as a particularly serious threat because they place comparatively few requirements on the attacker.
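As a minimal illustration of the timing gap described above (an assumed example, not part of the NTT study), the following C sketch times the same memory access twice on x86: once while the data sits in the cache and once after flushing it with clflush. On typical hardware the flushed access takes noticeably more cycles, and that difference is exactly the signal cache attacks measure.

```c
/*
 * Illustrative sketch (not from the article): measuring the timing gap
 * between a cached and an uncached memory access on x86.
 * Requires a CPU supporting rdtscp and clflush.
 */
#include <stdio.h>
#include <stdint.h>
#include <x86intrin.h>   /* _mm_clflush, __rdtscp, _mm_mfence */

static uint64_t time_access(volatile uint8_t *p) {
    unsigned aux;
    _mm_mfence();
    uint64_t start = __rdtscp(&aux);
    (void)*p;                        /* the access being timed */
    uint64_t end = __rdtscp(&aux);
    _mm_mfence();
    return end - start;
}

int main(void) {
    static uint8_t buf[4096];

    buf[0] = 1;                      /* warm the line: now cached */
    uint64_t hot = time_access(&buf[0]);

    _mm_clflush((void *)&buf[0]);    /* evict the line from the cache hierarchy */
    _mm_mfence();
    uint64_t cold = time_access(&buf[0]);

    printf("cached access:  %llu cycles\n", (unsigned long long)hot);
    printf("flushed access: %llu cycles\n", (unsigned long long)cold);
    return 0;
}
```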

One promising countermeasure against contention-based cache attacks is to randomize the cache index. Randomization is expected to make the cache hard to exploit because the attacker can no longer determine which cache index a given address maps to, but it has been unclear what level of randomization needs to be implemented to achieve this.
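To make the idea of index randomization concrete, here is a small C sketch (an assumed illustration, not drawn from the study): plain_index computes the conventional, publicly known set index as a fixed bit slice of the address, while randomized_index first passes the address through a secret-keyed mixing function. The mixer is a toy placeholder standing in for a dedicated cache random function such as SCARF and carries no security claim.

```c
/*
 * Conceptual sketch (assumed example, not the SCARF specification):
 * a conventional set index is a fixed bit slice of the address, so an
 * attacker can compute which set any address maps to. A randomized
 * index passes the address through a secret-keyed function first.
 */
#include <stdint.h>
#include <stdio.h>

#define LINE_BITS 6          /* 64-byte cache lines            */
#define SET_BITS  10         /* 1024 sets, purely illustrative */
#define SET_MASK  ((1u << SET_BITS) - 1)

/* Conventional index: fixed, publicly computable bit slice. */
static uint32_t plain_index(uint64_t paddr) {
    return (uint32_t)(paddr >> LINE_BITS) & SET_MASK;
}

/* Randomized index: a secret-keyed mixer stands in for a dedicated
 * cache random function. This toy mixer has no security claim. */
static uint32_t randomized_index(uint64_t paddr, uint64_t secret_key) {
    uint64_t x = (paddr >> LINE_BITS) ^ secret_key;
    x *= 0x9E3779B97F4A7C15ull;      /* toy diffusion step */
    x ^= x >> 29;
    return (uint32_t)x & SET_MASK;
}

int main(void) {
    uint64_t addr = 0x7ffd1234ABC0ull;
    printf("fixed index:      %u\n", plain_index(addr));
    printf("randomized index: %u\n",
           randomized_index(addr, 0x0123456789ABCDEFull));
    return 0;
}
```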

For example, encryption with a block cipher, a class of symmetric-key ciphers, is considered a candidate random function. Block ciphers, however, were originally developed to protect confidentiality: they are designed to remain secure even when an attacker can observe and manipulate all inputs and outputs. For a cache random function, whose output cannot be observed directly, such a cipher is therefore over-engineered.

Research Findings

To build an attack model that accurately reflects the attacker's capabilities when designing a cache randomization function, NTT first examined what an attacker can actually do against a cache random function. Specifically, NTT captured the attacker's view in a collision model, in which an input pair becomes observable only when part of the corresponding outputs collides, and replaced the plain block cipher with a tweakable block cipher used in an Enc-then-Dec construction, which encrypts under tweak t1 and then decrypts under tweak t2. This construction fits well with current symmetric-key cipher design theory, and with an appropriate design the latency can be roughly halved compared with conventional approaches.
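The following C sketch shows the structure of such an Enc-then-Dec construction in schematic form (an assumed illustration, not the SCARF specification): the input is encrypted under tweak t1 and the result is decrypted under a different tweak t2 with the same key. The toy 16-bit tweakable cipher, its round function, and its key schedule are placeholders with no cryptographic strength; only the composition is the point.

```c
/*
 * Schematic sketch of the Enc-then-Dec idea (assumed illustration, not
 * SCARF itself): encrypt under tweak t1, then decrypt under tweak t2
 * with the same key. The toy cipher below has no cryptographic strength.
 */
#include <stdint.h>
#include <stdio.h>

#define ROUNDS 4

static uint16_t rotl16(uint16_t x, unsigned r) { return (uint16_t)((x << r) | (x >> (16 - r))); }
static uint16_t rotr16(uint16_t x, unsigned r) { return (uint16_t)((x >> r) | (x << (16 - r))); }

/* Toy round-key schedule mixing key, tweak and round index. */
static uint16_t round_key(uint64_t key, uint16_t tweak, int i) {
    return (uint16_t)((key >> (8 * i)) ^ (uint16_t)(tweak * (2 * i + 1)));
}

static uint16_t toy_tbc_enc(uint16_t x, uint64_t key, uint16_t tweak) {
    for (int i = 0; i < ROUNDS; i++) {
        x ^= round_key(key, tweak, i);
        x = rotl16(x, 3);
    }
    return x;
}

static uint16_t toy_tbc_dec(uint16_t x, uint64_t key, uint16_t tweak) {
    for (int i = ROUNDS - 1; i >= 0; i--) {
        x = rotr16(x, 3);
        x ^= round_key(key, tweak, i);
    }
    return x;
}

/* Enc-then-Dec: using two different tweaks keeps the composition from
 * collapsing to the identity while reusing one cipher core. */
static uint16_t enc_then_dec(uint16_t x, uint64_t key, uint16_t t1, uint16_t t2) {
    return toy_tbc_dec(toy_tbc_enc(x, key, t1), key, t2);
}

int main(void) {
    uint16_t index_in = 0x02A7;
    uint16_t out = enc_then_dec(index_in, 0x0123456789ABCDEFull, /*t1=*/7, /*t2=*/42);
    printf("randomized value: 0x%04X\n", (unsigned)out);
    return 0;
}
```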

In this study, NTT proposed SCARF, a concrete cache random function built on the Enc-then-Dec construction. Its design draws on NTT's extensive experience in symmetric-key cipher design. Whereas existing low-latency block ciphers require a delay of roughly 560 to 630 ps in 15 nm technology, SCARF achieves a latency of 305.76 ps in the same environment, nearly halving it. This reduction is obtained through design techniques that fully exploit the Enc-then-Dec concept.

Future Development

The cache random function SCARF is designed to be applicable to a wide range of modern cache architectures, although some architectures remain incompatible with it. Extending the SCARF framework is expected to bring support for a more diverse range of architectures. NTT will continue to refine this kind of purpose-built cryptographic technology, which, in a limited-use setting such as the one studied here, performs markedly better than general-purpose encryption techniques.

Paper Information

Federico Canale, Tim Güneysu, Gregor Leander, Jan Philipp Thoma, Yosuke Todo, Rei Ueno, 'SCARF – A Low-Latency Block Cipher for Secure Cache-Randomization,' USENIX Security 2023.

Reference

Contention-based cache attack mechanism: suppose the attacker wants to determine whether the target accessed address A. The attacker chooses an address from their own address space that maps to the same cache index as address A, accesses it so that the contested cache set is filled with the attacker's data, and then waits for the target to run. After the target has finished executing, the attacker re-accesses the previously chosen address and measures the response time. If the target accessed address A, the attacker's cache line has been evicted and replaced by data for address A, so the re-access is slow; if the target did not access address A, the data is still cached and the re-access is fast. In this way, the attacker learns whether the target used address A.
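The procedure above can be summarized in code. The C sketch below (an assumed illustration, not attack-ready code) follows the prime, wait, and probe steps; the victim is simulated by flushing the attacker's line, standing in for the eviction that a real access to address A would cause, and the construction of a genuine eviction set is omitted. THRESHOLD is a machine-dependent value chosen here purely for illustration.

```c
/*
 * Sketch of the contention-based prime/wait/probe procedure described
 * above (assumed illustration). The victim is simulated: flushing the
 * attacker's line stands in for eviction caused by an access to A.
 */
#include <stdio.h>
#include <stdint.h>
#include <x86intrin.h>

#define THRESHOLD 150            /* cycles; machine-dependent, illustrative only */

static uint8_t attacker_line[64]; /* stands in for an address sharing A's cache set */

static uint64_t timed_access(volatile uint8_t *p) {
    unsigned aux;
    _mm_mfence();
    uint64_t t0 = __rdtscp(&aux);
    (void)*p;
    uint64_t t1 = __rdtscp(&aux);
    return t1 - t0;
}

/* Simulated victim: mimics the target evicting our line via address A. */
static void run_victim(int did_use_A) {
    if (did_use_A) {
        _mm_clflush((void *)attacker_line);  /* our line leaves the cache */
        _mm_mfence();
    }
}

static int infer_use_of_A(int ground_truth) {
    (void)*(volatile uint8_t *)attacker_line; /* prime: load our line into the contested set */
    run_victim(ground_truth);                 /* wait for the target to execute */
    return timed_access(attacker_line) > THRESHOLD; /* slow probe => eviction => A was used */
}

int main(void) {
    printf("victim used A:       inferred %d\n", infer_use_of_A(1));
    printf("victim did not use A: inferred %d\n", infer_use_of_A(0));
    return 0;
}
```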
