Startup secures $13M to better train deep learning models

Artificial intelligence startup Run:AI has secured $13 million in funding for its solution for accelerating the training of deep learning models, the company announced April 3.

Run:AI, which is based in Tel Aviv, Israel, created a high-performance compute virtualization layer for deep learning that speeds up the training of neural network models, according to a release. Today, researchers typically train models by running deep learning workloads on a number of graphics processing units (GPUs), which can run continuously for days or even weeks on expensive hardware.

“Traditional computing uses virtualization to help many users or processes share one physical resource efficiently,” Omri Geller, co-founder and CEO of Run:AI, said in the release. “Virtualization tries to be generous. But a deep learning workload is essentially selfish since it requires the opposite—it needs the full computing power of multiple physical resources for a single workload, without holding anything back.

“Traditional computing software just can’t satisfy the resource requirements for deep learning workloads.”

Run:AI’s software, on the other hand, creates a compute abstraction layer that automatically analyzes the computational characteristics of workloads, eliminating bottlenecks and optimizing workloads with graph-based parallel computing algorithms. It automatically allocates and runs workloads, making deep learning experiments run faster and lowering the costs associated with training AI. According to the company, its solution will enable the development of “huge” AI models.
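Run:AI's actual scheduler is proprietary and far more sophisticated than anything shown here, but the general idea of resource-aware workload allocation can be illustrated with a toy example. The sketch below (all names and numbers are hypothetical, not drawn from Run:AI) greedily places each job on the GPU with the least remaining memory that still fits it:

```python
# Toy illustration of resource-aware workload allocation.
# This is NOT Run:AI's implementation -- just a minimal sketch of the
# general idea of matching deep learning jobs to available GPUs.

from dataclasses import dataclass, field

@dataclass
class GPU:
    name: str
    free_memory_gb: float
    jobs: list = field(default_factory=list)

def allocate(workloads, gpus):
    """Greedy best-fit: place each job (largest first) on the GPU
    with the least remaining free memory that can still hold it."""
    placements = {}
    for job, mem_needed in sorted(workloads.items(), key=lambda kv: -kv[1]):
        candidates = [g for g in gpus if g.free_memory_gb >= mem_needed]
        if not candidates:
            placements[job] = None  # job would have to wait or be split
            continue
        best = min(candidates, key=lambda g: g.free_memory_gb)
        best.free_memory_gb -= mem_needed
        best.jobs.append(job)
        placements[job] = best.name
    return placements

gpus = [GPU("gpu0", 16.0), GPU("gpu1", 24.0)]
workloads = {"resnet": 10.0, "bert": 20.0, "small-cnn": 4.0}
print(allocate(workloads, gpus))
```

A real system of the kind the release describes would also weigh network bandwidth, data pipeline throughput, and cost, and could split a single large model across several devices rather than treating each job as indivisible.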

Run:AI received $3 million from TLV Partners in its seed round and an additional $10 million in a Series A round led by Haim Sadger’s S Capital and TLV Partners.

“Executing deep neural network workloads across multiple machines is a constantly moving target, requiring recalculations for each model and iteration based on availability of resources,” Rona Segev-Gal, managing partner of TLV Partners, said in the release. “Run:AI determines the most efficient and cost-effective way to run a deep learning training workload, taking into account the network bandwidth, compute resources, cost, configurations and the data pipeline and size. We’ve seen many AI companies in recent years, but Omri, Ronen and Meir’s approach blew our mind.”


After graduating from Indiana University-Bloomington with a bachelor’s in journalism, Anicka joined TriMed’s Chicago team in 2017 covering cardiology. Close to her heart is long-form journalism, Pilot G-2 pens, dark chocolate and her dog Harper Lee.
