Ecosystem News

Researchers Build a RISC-V Chip That Calculates in Posits, Boosting Accuracy for ML Workloads | Gareth Halfacree, Hackster.io

Designed as an alternative to floating-point numbers, posits may prove key to boosting machine learning performance.

A team of scientists at the Complutense University of Madrid has developed the first processor core to calculate in posits, a novel number representation designed as an alternative to floating-point arithmetic that, its creators say, can deliver accuracy improvements of orders of magnitude over standard floating-point formats.
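For readers unfamiliar with the format, the sketch below decodes a posit bit pattern in plain Python: a sign bit, a variable-length regime, an exponent, and a fraction, combined as (-1)^s * useed^regime * 2^exp * (1 + fraction) with useed = 2^(2^es). It is an illustrative approximation only, not the PERCIVAL hardware or any official posit library; the width n, the exponent size es, and the function name decode_posit are choices made for the example.

```python
def decode_posit(bits: int, n: int = 16, es: int = 1) -> float:
    """Decode an n-bit posit with es exponent bits into a Python float.

    Illustrative sketch: real posit hardware keeps the value exact,
    whereas converting to a float here can reintroduce rounding.
    """
    mask = (1 << n) - 1
    bits &= mask
    if bits == 0:
        return 0.0
    if bits == 1 << (n - 1):             # 1000...0 encodes NaR ("not a real")
        return float("nan")

    sign = bits >> (n - 1)
    if sign:                              # negative posits decode via two's complement
        bits = (-bits) & mask

    # Regime: run of identical bits after the sign bit.
    regime_bit = (bits >> (n - 2)) & 1
    run, i = 0, n - 2
    while i >= 0 and ((bits >> i) & 1) == regime_bit:
        run += 1
        i -= 1
    regime = run - 1 if regime_bit else -run

    # Exponent: up to `es` bits after the regime terminator.
    i -= 1                                # skip the terminating bit (if present)
    exp, exp_bits = 0, 0
    while i >= 0 and exp_bits < es:
        exp = (exp << 1) | ((bits >> i) & 1)
        exp_bits += 1
        i -= 1
    exp <<= es - exp_bits                 # pad a truncated exponent with zeros

    # Fraction: whatever bits remain, with an implicit leading 1.
    frac_bits = i + 1
    frac = bits & ((1 << frac_bits) - 1) if frac_bits > 0 else 0
    fraction = 1.0 + frac / (1 << frac_bits) if frac_bits > 0 else 1.0

    useed = 2 ** (2 ** es)
    value = (useed ** regime) * (2 ** exp) * fraction
    return -value if sign else value


print(decode_posit(0x4000))   # 1.0
print(decode_posit(0xC000))   # -1.0
```

Because the regime grows or shrinks with the magnitude, posits spend more of their bits on the fraction near 1.0, which is where much machine learning arithmetic happens; that tapered precision is the source of the accuracy claims.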

“In this work, we present PERCIVAL, an application-level posit RISC-V core based on CVA6 that can execute all posit instructions, including the quire fused operations,” the team explains of its progress in using posits for real-world computing. “This solves the obstacle encountered by previous works, which only included partial posit support or which had to emulate posits in software.”
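The quire mentioned in that quote is a wide fixed-point accumulator that lets fused operations, such as dot products, defer rounding until a single final step. The toy sketch below mimics that behaviour with exact rational arithmetic; the helper names fused_dot and naive_dot are invented for the illustration, and Python's Fraction merely stands in for the hardware quire register rather than reflecting the PERCIVAL instruction set.

```python
from fractions import Fraction

def fused_dot(xs, ys):
    """Quire-style fused dot product: accumulate every product exactly,
    then round to the working format only once, at the very end."""
    acc = Fraction(0)
    for x, y in zip(xs, ys):
        acc += Fraction(x) * Fraction(y)   # exact product, exact accumulation
    return float(acc)                      # single rounding step

def naive_dot(xs, ys):
    """Round after every multiply-add, as a conventional FPU pipeline would."""
    acc = 0.0
    for x, y in zip(xs, ys):
        acc += x * y                       # rounding error can creep in each step
    return acc

# A cancellation-heavy case where stepwise rounding loses information.
xs = [1e16, 1.0, -1e16]
ys = [1.0, 1.0, 1.0]
print(naive_dot(xs, ys))   # 0.0  (the 1.0 is absorbed, then cancelled away)
print(fused_dot(xs, ys))   # 1.0  (exact accumulation preserves it)
```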

Read the full article.