Multiverse Computing SL, a startup with technology that reduces the hardware footprint of artificial intelligence models, is reportedly raising new capital.
Bloomberg reported today, citing sources, that the Spanish company is seeking 500 million euros, or approximately $594.4 million. The round would reportedly value Multiverse at 1.5 billion euros.
The report comes about eight months after the software maker raised $215 million from a consortium including Toshiba Corp., HP Tech Ventures and others. At least some of those investors are expected to participate in the new round, and Bloomberg's sources indicated that several new backers are likely to join as well.
Multiverse’s flagship product is a platform called CompactifAI that reduces the amount of infrastructure needed to run AI models. According to the company, the software can cut training times in half and speed up inference by 25%. Multiverse also promises to reduce the models’ storage footprint in the process.
An AI model considers multiple data points when making decisions. Some of these data points influence the processing workflow more than others. A model tasked with predicting a retailer's annual revenue, for example, might place more weight on its historical sales figures than on the financial performance of its competitors.
The components of the model that determine how much different data points influence a decision are called weights. They are stored in a mathematical structure called a weight matrix. According to Multiverse, its CompactifAI platform reduces the hardware requirements of AI models by transforming their weight matrices into tensor networks, mathematical objects most commonly used to study quantum mechanical phenomena.
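To illustrate the general idea, the sketch below compresses a dense weight matrix with a truncated SVD, a simple low-rank factorization in the same spirit as the tensor-network decompositions described above. The matrix dimensions and rank are illustrative assumptions, not Multiverse's actual settings, and CompactifAI's real decomposition is more sophisticated than this.

```python
import numpy as np

def compress_weight_matrix(W: np.ndarray, rank: int):
    """Factor W (m x n) into two smaller matrices A (m x r) and B (r x n)."""
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * S[:rank]   # m x r, columns scaled by singular values
    B = Vt[:rank, :]             # r x n
    return A, B

W = np.random.randn(4096, 4096).astype(np.float32)  # hypothetical layer weights
A, B = compress_weight_matrix(W, rank=128)

original_params = W.size
compressed_params = A.size + B.size
print(f"parameters: {original_params:,} -> {compressed_params:,} "
      f"({compressed_params / original_params:.1%} of original)")

# At inference time, x @ W is approximated by (x @ A) @ B, which requires far
# fewer multiply-accumulates and far less memory when rank << min(m, n).
```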
The process of transforming weight matrices into tensor networks introduces errors into AI models. CompactifAI mitigates these errors by retraining a neural network after compressing it, a process Multiverse calls healing. The task can be completed relatively quickly and requires only a handful of graphics cards.
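A minimal sketch of what such a healing pass might look like is shown below, assuming a compressed PyTorch model and a small calibration dataset. The names `compressed_model` and `calibration_loader` are hypothetical placeholders, and the loop is a generic fine-tuning pass rather than Multiverse's proprietary procedure.

```python
import torch
import torch.nn.functional as F

def heal(compressed_model, calibration_loader, steps=200, lr=1e-4, device="cuda"):
    """Briefly retrain a compressed model to recover accuracy lost to compression."""
    model = compressed_model.to(device).train()
    optimizer = torch.optim.AdamW(model.parameters(), lr=lr)
    data_iter = iter(calibration_loader)
    for _ in range(steps):
        try:
            inputs, targets = next(data_iter)
        except StopIteration:
            data_iter = iter(calibration_loader)
            inputs, targets = next(data_iter)
        inputs, targets = inputs.to(device), targets.to(device)
        loss = F.cross_entropy(model(inputs), targets)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return model.eval()
```

Because only a short run over a small dataset is needed, a loop like this can finish quickly on a handful of GPUs, which matches the company's description of the healing step.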
In a paper published in May 2024, Multiverse detailed how its researchers used CompactifAI to compress the Llama 2 7B language model. They combined the platform with a model optimization method called quantization. Multiverse claims to have reduced Llama 2 7B's memory footprint by 93% at the expense of a 3% drop in output accuracy.
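The rough sketch below shows how quantization can compound with compression by storing an already-compressed weight factor in 8-bit integers. The symmetric per-tensor scheme used here is a common textbook approach; the exact quantization method Multiverse applied is not described in the report, and the matrix shown is a hypothetical example.

```python
import numpy as np

def quantize_int8(M: np.ndarray):
    """Symmetric per-tensor quantization of a float32 matrix to int8."""
    scale = np.abs(M).max() / 127.0
    q = np.clip(np.round(M / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

A = np.random.randn(4096, 128).astype(np.float32)  # hypothetical compressed weight factor
A_q, scale = quantize_int8(A)

print(f"{A.nbytes:,} bytes (fp32) -> {A_q.nbytes:,} bytes (int8), "
      f"{A_q.nbytes / A.nbytes:.0%} of original")
```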
Alongside CompactifAI, the company offers a platform called Singularity, which provides access to AI models that automate industry-specific use cases across several major verticals. One module can detect malfunctions in factory equipment, while another helps financial professionals make investment decisions. There is also a cybersecurity engine that detects malicious network traffic.
Multiverse claims its products are used by more than 100 organizations. The company’s installed base includes Allianz SE, Moody’s Corp. and several other large companies. Customers can install CompactifAI on their infrastructure or use an application programming interface to access popular open source AI models that have been compressed in advance.
Multiverse reportedly intends to close its funding round in the first half of the year.
Image: Unsplash
