Powering Big Data - A Systems Approach to Future Computing Platforms

Strategic Research Initiatives

Robert Pilawa-Podgurski, Philip Krein, Yi Lu, Naresh Shanbhag: Electrical and Computer Engineering

Roy Campbell: Computer Science 

Addressing the Problem

Modern computing has hit a “power wall”: thermal and energy limits impose severe constraints on further technology advances. Energy use in the nation’s information technology (IT) infrastructure is growing rapidly and already accounts for several percent of national electricity consumption. The IT industry is moving toward greater energy efficiency (“green systems”) and robustness, but conventional technologies and design methods are approaching fundamental limits that impede progress. Energy frugality and miniaturization are critical in portable devices, and efficient use of energy now matters at every scale, from laptop computers to massive data centers. Over its lifetime, a data center’s largest cost is its electricity, making energy-efficient computing a top priority as big-data processing continues to grow. Overcoming these limits requires a systems approach that spans parallel computing and software, low-power digital circuits, and power electronics.

Research Goals

The overall goal is to combine design requirements and practices in parallel computing and software systems, low-power digital circuits, and power electronics to achieve dramatic leaps in computational energy efficiency. This work aims to circumvent the power wall and realize the large untapped potential for improved energy efficiency in IT and computing applications. The research team will exploit emerging technologies to establish system-oriented design approaches to computing in which power conversion, digital hardware, and software become equal partners in managing both performance objectives and energy consumption.