As AI technology advances rapidly, applications such as generative AI are testing the limits of current digital hardware like CPUs and GPUs, driving growing interest in analog hardware designed specifically for AI computation. Analog hardware adjusts semiconductor resistance with an external voltage or current and uses a cross-point array structure to process AI computations in parallel. It offers advantages over digital hardware for certain workloads, particularly continuous data processing, but has struggled to meet the diverse demands of computational learning and inference.
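The parallelism of a cross-point array comes from basic circuit laws: each device stores a conductance, and applying input voltages to the rows produces column currents that are exactly the dot products of a matrix-vector multiply. The sketch below is an illustrative numerical model of that idea only, not the POSTECH hardware; the array size, conductance values, and voltages are arbitrary assumptions.

```python
import numpy as np

# Illustrative model of analog matrix-vector multiplication in a
# cross-point array. The device at row i, column j stores a conductance
# G[i, j]; applying voltages V[i] to the rows yields column currents
# I[j] = sum_i G[i, j] * V[i] (Ohm's and Kirchhoff's laws), so the whole
# matrix-vector product is read out in one parallel analog step.
rng = np.random.default_rng(0)
G = rng.uniform(0.0, 1.0, size=(4, 3))  # device conductances (arbitrary units)
V = np.array([0.2, -0.1, 0.4, 0.3])     # input voltages applied to the rows

I = V @ G  # column currents: all dot products at once

print(I)
```

In digital hardware the same product costs one multiply-accumulate per device; in the analog array every column current settles simultaneously, which is the source of the claimed efficiency for AI workloads.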
To overcome these challenges, the POSTECH research team focused on ECRAM (electrochemical random-access memory), which controls electrical conductivity through the movement and concentration of ions. Unlike conventional semiconductor memory, these devices use a three-terminal structure with separate paths for reading and writing data, enabling low-power operation.
The team successfully fabricated ECRAM devices using three-terminal semiconductors in a 64 × 64 array. Their experiments demonstrated that the hardware exhibited excellent electrical and switching characteristics, along with high yield and uniformity. The team also applied the Tiki-Taka algorithm, an advanced analog-based learning algorithm, to this high-yield hardware, achieving maximum accuracy in AI neural network training computations. Importantly, they highlighted the "weight retention" property of the hardware, which ensures efficient training without overloading artificial neural networks, indicating strong potential for commercialization.
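The core structure of Tiki-Taka training is the use of two arrays: noisy gradient updates accumulate on an auxiliary array A, which is periodically transferred into a main array C, with the effective weight read from both. The following is a deliberately simplified sketch of that structure on a toy linear model; the model, hyperparameters, and transfer schedule are all illustrative assumptions, not the algorithm as published or the team's retention-aware variant.

```python
import numpy as np

# Highly simplified sketch of the Tiki-Taka two-array training structure:
# gradient updates land on auxiliary array A, which is periodically
# consolidated into main array C. The effective weight is W = C + gamma*A.
# The toy linear-regression task and all constants below are assumptions
# for illustration only.
rng = np.random.default_rng(1)
w_true = np.array([0.5, -0.3])
X = rng.normal(size=(500, 2))
y = X @ w_true                       # noiseless targets for the toy task

A = np.zeros(2)                      # auxiliary array: receives updates
C = np.zeros(2)                      # main array: consolidated weights
gamma, lr, transfer_every = 0.5, 0.2, 10

for step, (x, t) in enumerate(zip(X, y), start=1):
    W = C + gamma * A                # effective weight read from both arrays
    err = (W @ x) - t
    A -= lr * err * x                # rank-one update applied to A only
    if step % transfer_every == 0:   # periodic transfer from A to C
        C += gamma * A
        A[:] = 0.0                   # reset A; W is unchanged by the transfer

W = C + gamma * A
print(W)
```

Because the transfer moves A's contribution into C before resetting A, the effective weight is preserved across transfers; in hardware, this consolidation step is where device properties such as weight retention matter most.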
This research is particularly significant as the largest previously reported array of ECRAM devices for storing and processing analog signals was 10 × 10. The team has now successfully implemented these devices on a much larger scale with varied characteristics for each device.
Professor Seyoung Kim of POSTECH commented, "By realizing large-scale arrays based on novel memory device technologies and developing analog-specific AI algorithms, we have identified the potential for AI computational performance and energy efficiency that far surpass current digital methods."
Research Report: Retention-aware zero-shifting technique for Tiki-Taka algorithm-based analog deep learning accelerator
Related Links
Pohang University of Science and Technology