With the paradigm shift in computer systems toward ubiquitous computing, energy has joined performance as a key measure of efficiency. Java is increasingly the programming language of choice for applications expected to run in embedded and mobile environments: its platform independence and security features serve well the need to run the same application securely across a variety of devices. These devices, for example hand-held computers, have limited battery life, and the desire to lengthen the period between recharges and to reduce cooling costs makes energy an important design parameter. The authors propose two object-allocation strategies that can significantly reduce the memory-system energy consumption of Java applications. The first strategy uses part of the on-chip memory resources as a local memory to achieve better performance than a cache-only architecture; this allocation strategy is implemented through an annotation-based approach and is shown to improve performance and reduce memory-system energy consumption, cutting energy by up to 39% and the cache miss rate by up to 31%. The second strategy, object co-location, exploits the temporal locality already present in heap references to achieve better spatial locality; implemented in the garbage collector of a Java virtual machine, it reduces the cache miss rate by up to 47% and, consequently, memory energy consumption by up to 49%.
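The annotation-based allocation strategy described above might look like the following minimal sketch. The `@ScratchpadAllocate` annotation name and the reflection-based lookup are illustrative assumptions, not the paper's actual mechanism: in the authors' system the Java virtual machine itself would consult such a hint and place instances in the on-chip local memory, whereas this sketch only shows how a type-level hint can be declared and read.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Hypothetical annotation marking classes whose instances should be
// allocated in the on-chip local memory rather than the cached heap.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
@interface ScratchpadAllocate {}

public class AllocationHintDemo {
    // A frequently accessed object type, tagged for local-memory placement.
    @ScratchpadAllocate
    public static class HotBuffer {
        int[] data = new int[16];
    }

    // Allocator front end (sketch only): a real VM would route the
    // allocation to the local-memory region; here we merely read the hint.
    public static boolean prefersLocalMemory(Class<?> c) {
        return c.isAnnotationPresent(ScratchpadAllocate.class);
    }

    public static void main(String[] args) {
        System.out.println(prefersLocalMemory(HotBuffer.class)); // true
        System.out.println(prefersLocalMemory(String.class));    // false
    }
}
```

Keeping the hint at the type level, as here, lets the allocation decision be made once per class rather than per allocation site, which matches the coarse-grained, annotation-driven approach the abstract describes.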