Watching a machine learning model at work can sometimes feel like magic, but in reality it’s just math. A lot of math. Nothing terribly fancy, mind you: statistics, probability theory, multivariate calculus, linear algebra and algorithms. OK, it’s a little fancy, but the point is that there’s a lot of math to do. A reasonably accurate statistical model, with weights and biases fit to its training data, does not get calculated on the back of an envelope. It’s calculated on computer chips, and as the volume of data and the demand for insights from that data grow, so does the demand for ever-faster silicon capable of crunching those numbers.
At a certain level of computational complexity, regular central processing units don’t cut it. They’ll do the math just fine, but they’ll take a long time to do it. Graphics cards were designed for massively parallel computing operations, like rendering the multiplying number of pixels on our ever-denser visual displays, and it just so happens that this embarrassingly parallel architecture is well suited to doing machine learning math quickly.
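The "math" in question is largely big matrix multiplications: every output element is an independent dot product, which is exactly the kind of work a GPU can spread across thousands of cores at once. A minimal sketch in Python, using NumPy as a stand-in for the batched linear algebra an accelerator speeds up:

```python
import numpy as np

# A single dense neural-network layer is a matrix multiply plus a bias.
# Each of the 64 x 256 output elements is an independent dot product,
# which is why this workload parallelizes so well on GPUs.
rng = np.random.default_rng(0)
batch = rng.standard_normal((64, 512))     # 64 inputs, 512 features each
weights = rng.standard_normal((512, 256))  # layer weights
bias = rng.standard_normal(256)            # layer biases

activations = batch @ weights + bias       # shape: (64, 256)
print(activations.shape)
```

The shapes and layer sizes here are illustrative, not drawn from any particular model; the point is only that the core operation decomposes into many independent computations.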
But there are plenty of applications where running a bunch of graphics processing units (GPUs) in a data center is not practical. Take an autonomous vehicle, for example: It doesn’t make sense to pipe all the data streaming off the onboard cameras and lidar sensors up to a cloud service, wait for it to be processed, and then pipe the results back into a car’s onboard computer. At 60 miles an hour, that kind of latency could be lethal.
As the world becomes more data-driven and our tech uses inference to be more responsive, a new generation of computer chips is required to make all the math-magic happen. At a certain scale of computational complexity, or in situations where electrical consumption has to be kept to a minimum, GPUs don’t cut it either.
Headquartered in Tel Aviv, Israel, Hailo is one of several companies vying for its spot in the competitive market for specialized artificial intelligence chips built for computing on the edge: automotive applications, mobile devices, AI-augmented home devices and industrial use cases.
Today the company announced that it’s raised $60 million in Series B funding. The round was led by existing backers but saw participation from new strategic investors including the venture arm of robotics and automation company ABB, Japanese IT conglomerate NEC Corporation and London-based VC Latitude Ventures.
The company says that its new funding will be used to “bolster the ongoing global rollout of its breakthrough Hailo-8 Deep Learning chip and to reach new markets and industries worldwide.”
Hailo says its chip is capable of up to 26 trillion operations per second while drawing less than 5 watts at full utilization. Hailo-8 supports popular machine learning frameworks like TensorFlow and the Open Neural Network Exchange (ONNX) format, and meets several compliance standards for automotive applications.
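Taken at face value, the company's stated figures imply a power efficiency of roughly 5.2 trillion operations per second per watt. That's a back-of-envelope reading of the marketing numbers, not an independent benchmark:

```python
# Back-of-envelope efficiency from the company's stated figures:
# up to 26 trillion operations per second (26 TOPS) at under 5 watts.
tops = 26    # tera-operations per second, as claimed
watts = 5    # stated upper bound on power draw at full utilization

tops_per_watt = tops / watts
print(tops_per_watt)  # 5.2
```

Because 5 watts is given as an upper bound and 26 TOPS as a peak, the real-world figure would depend on workload; this is a best-case ratio.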
According to the company, the chip’s “Structure-Defined Dataflow Architecture translates into higher performance, lower power, and minimal latency, enabling more privacy and better performance for smart devices operating at the edge, including partially autonomous vehicles, smart cameras, smartphones, drones and AR/VR platforms.”
“This immense vote of confidence from our new strategic and financial investors, along with existing ones, is a testimony to our breakthrough innovation and market potential,” said Orr Danon, CEO and co-founder of Hailo. “The new funding will help us expedite the deployment of new levels of edge computing capabilities in smart devices and intelligent industries around the world, including areas such as mobility, smart cities, industrial automation, smart retail and beyond.”
Since its inception in February 2017, the company has raised $88 million in total funding, inclusive of the round announced today. In January 2019, the company closed $8.5 million in Series A funding led by Chinese venture firm Glory Ventures. No additional details about the company’s revenue or valuation were disclosed.
Illustration: Li-Anne Dias