Edge AI is emerging as a new challenge for memory, and Winbond's CUBE is designed to meet customers' performance and power-consumption needs

Memory manufacturer Winbond Electronics, which has recently been lifted by rising memory prices, took part in the memory forum at the Semicon Taiwan event, where General Manager Chen Pei-yan pointed out that memory innovation plays a key role in the era of edge AI. As AI applications expand from the cloud to edge devices, the demands on memory rise, especially in terms of performance, power consumption and latency.

Chen Pei-yan said that with the rapid development of artificial intelligence, demand for AI keeps growing. AI computing was initially done mostly in the cloud, but the need for real-time decisions, low latency and low power consumption is pushing AI from the cloud to edge devices. Edge AI spans a wide range of applications, including consumer electronics, PCs, telecommunications, automobiles and future AI robots.

In traditional microcontroller (MCU) or microprocessor (MPU) systems, computing power is limited and power consumption is not the main consideration. However, when AI inference is introduced into edge devices, especially at the ASIC and MPU level, memory usage increases significantly and power consumption can reach as high as 100 watts. Unlike cloud AI (for example, the GB200 platform combines CPUs with multiple HBM stacks for roughly 480 GB of memory), edge AI must balance performance, power consumption and cost, which makes the choice of memory critical.

Market trends also show that hardware (including SoCs, low-power processors and AI accelerators) dominates the edge AI market, which is expected to grow roughly fivefold over the next six to seven years, pointing to huge potential. Addressing these challenges requires innovation in the memory architecture: L1 to L3 caches, followed by an L4 cache (SRAM or CUBE), and then HBM, LPDDR and DDR as main memory. Only such a hierarchy can meet the power and cost requirements of edge AI, which makes 3D memory, stacking and advanced packaging technologies the solution.
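A minimal sketch of that hierarchy, purely to make the ordering concrete; the tier names follow the description above, and no latency or capacity figures are implied:

```python
# Conceptual ordering of the edge-AI memory hierarchy described above, fastest/closest first.
# Tier names follow the talk; this is only an illustration of where CUBE slots in.
EDGE_AI_MEMORY_HIERARCHY = [
    "L1-L3 cache (on-chip SRAM)",
    "L4 cache (large SRAM or stacked CUBE DRAM)",
    "Main memory (HBM / LPDDR / DDR)",
]

for level, tier in enumerate(EDGE_AI_MEMORY_HIERARCHY, start=1):
    print(f"Tier {level}: {tier}")
```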

Chen Pei-yan said that CUBE (Customized Ultra-Bandwidth Elements), proposed by Winbond, is designed to meet the memory needs of the edge AI era. CUBE aims to provide high-performance, high-density and low-power memory solutions. Its core strengths are ultra-high bandwidth and density: bandwidth can scale from 7 GB per second up to 3 TB per second, enough to handle about 30,000 tokens per second. In one application case, a traditional design might require up to eight LPDDR4 chips, while a Winbond CUBE solution might need only one or two, delivering up to 128 GB per second in a footprint of about 50 square millimeters. For applications requiring even higher density and bandwidth, CUBE can provide up to 30 TB per second of bandwidth and 70 GB of density.
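As a rough back-of-the-envelope check on the chip-count claim, the sketch below assumes a typical x32 LPDDR4 chip at 4266 MT/s; these LPDDR4 parameters are illustrative assumptions, not figures from the talk:

```python
# Rough bandwidth arithmetic for the "eight LPDDR4 chips vs. one or two CUBE dies" claim.
# Assumption: a single x32 LPDDR4 chip at 4266 MT/s (a typical upper-end configuration).
lpddr4_rate_mtps = 4266          # transfers per second, in millions
lpddr4_bus_width_bits = 32       # one x32 channel per chip
per_chip_gbps = lpddr4_rate_mtps * lpddr4_bus_width_bits / 8 / 1000   # ~17 GB/s per chip

chips = 8
aggregate_gbps = chips * per_chip_gbps   # ~136 GB/s across eight chips

cube_gbps = 128                  # bandwidth cited for a single ~50 mm^2 CUBE solution
print(f"8 x LPDDR4 ≈ {aggregate_gbps:.0f} GB/s vs. CUBE ≈ {cube_gbps} GB/s")
```

Under these assumed figures, a single CUBE solution lands in the same bandwidth ballpark as eight discrete LPDDR4 chips, which is consistent with the replacement ratio described above.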

Power consumption can also be significantly reduced: compared with a traditional LPDDR4 solution, Winbond's CUBE cuts power consumption substantially, a key advantage for edge AI devices. Moreover, CUBE can not only serve in ISP solutions, replacing multiple LPDDR4 chips, but also act as a hybrid memory working alongside the GPU/CPU, serving as L4 cache and as part of the working memory, greatly improving memory density.

Chen Pei-yan emphasized that CUBE's many advantages rely on advanced manufacturing technologies, including the wafer-on-wafer stacking used by Winbond as well as chip-on-wafer solutions. These stacking technologies allow multiple DRAM layers to be stacked for higher density. In addition, CUBE uses through-silicon via (TSV) technology to create a large number of interconnections between the stacked dies, with more than 600K I/Os, which is what enables its ultra-high bandwidth. Winbond also offers embedded silicon capacitors as an auxiliary component that can be integrated inside CUBE or into the power-delivery design to optimize the power supply.
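To see how such a wide interface translates into terabytes per second, here is a hedged illustration; the >600K I/O count comes from the talk, while the per-pin data rate is an assumed round number chosen only to show the arithmetic:

```python
# Illustrative link between I/O count and aggregate bandwidth for a stacked DRAM interface.
# Assumption: each TSV-connected I/O runs at a modest 40 Mb/s (illustrative only).
io_count = 600_000                       # interconnections cited for CUBE
per_io_mbps = 40                         # assumed per-pin data rate

total_tbits_per_s = io_count * per_io_mbps / 1e6   # aggregate, in terabits per second
total_tbytes_per_s = total_tbits_per_s / 8         # convert to terabytes per second
print(f"{io_count} I/Os x {per_io_mbps} Mb/s ≈ {total_tbytes_per_s:.0f} TB/s")
```

The point of the arithmetic: with hundreds of thousands of short, dense TSV connections, each pin can run at a very low rate (and hence low power) while the stack still reaches the terabytes-per-second class of bandwidth mentioned above.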

Chen Pei-yan concluded by emphasizing that with the continued rapid progress of AI algorithms (such as Transformer, GPT-3 and GPT-5), the demands on CPU, GPU and memory performance keep rising. Winbond launched CUBE in 2023 with the goal of leading the memory industry into the AI era. Although AI is developing at an astonishing pace and roadmaps beyond 2025 are hard to predict accurately, the prospects for AI are bright and it will greatly improve people's lives. In that context, CUBE will play an important role in edge AI solutions through its unique innovations. Many memory companies are launching similar solutions, showing that the direction of innovation CUBE represents is shared across the industry. Winbond will therefore continue to supply products such as DDR4 for niche markets while leading AI memory innovation with CUBE.


