New-Tech Europe Magazine | April 2019

Using Hardware Emulation to Verify AI Designs

Jean-Marie Brunet, Sr. Director of Marketing, Mentor, a Siemens Business

You can’t turn around these days without seeing a reference to AI – even as a consumer. AI, or artificial intelligence, is hot due to new machine-learning (ML) techniques that are evolving daily. It’s often cited as one of the critical markets for electronics purveyors, but it’s not really a market: it’s a technology. And it’s quietly – or not so quietly – moving into many, many markets. Some of those markets include safety-critical uses, meaning that life and limb can depend on how well it works. AI is incredibly important, but it differs from many other important technologies in how it’s verified.

Three Key Requirements

AI/ML verification brings with it three key needs: determinism, scalability, and virtualization. These aren’t uncommon hardware emulation requirements, but many other technologies require only two out of those three. AI is the perfect storm that needs all three.

ML involves the creation of a model during what is called the “training phase” – at least in its supervised version. That model is then implemented in a device or in the cloud for inference, where the trained model is put to use in an application. The training is very sensitive. From a vast set of training examples, you’ll derive a model. Change the order of the training samples by even one, and you’ll get a different model. That different model may work just fine – that’s one of the things about ML; there are many correct solutions. Each may arrive at the same answer, but the path there will be different. AI training techniques include ways of ensuring that your model isn’t biased towards one training set, but the techniques all involve a repeatable set of steps and patterns for consistent results, because you can’t verify a model that keeps changing.
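
To make that order sensitivity concrete, here is a minimal Python sketch; the tiny model, toy data, learning rate, and seeds are all assumptions chosen purely for illustration. It trains the same small classifier twice on the same samples, presented in two different orders, and the resulting weights do not match even though both models behave similarly.

    # Minimal sketch: identical data and identical starting point, only the sample order differs.
    import numpy as np

    def train(order_seed):
        rng = np.random.default_rng(0)               # same toy data set for every call
        x = rng.normal(size=(200, 8))
        y = (x[:, 0] + x[:, 1] > 0).astype(float)
        w = np.zeros(8)                              # same initial weights for every call
        order = np.random.default_rng(order_seed).permutation(len(x))
        for i in order:                              # one pass of stochastic gradient descent
            p = 1.0 / (1.0 + np.exp(-x[i] @ w))
            w -= 0.1 * (p - y[i]) * x[i]
        return w

    w_a, w_b = train(1), train(2)
    print(np.allclose(w_a, w_b))   # False: a different presentation order yields a different model

Both runs classify the toy data about equally well, yet the models are not identical, so their outputs can’t be meaningfully compared from run to run unless the whole training recipe is pinned down.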

Likewise, during verification, the test input patterns must remain consistent from run to run. If, for example, you use in-circuit emulation (ICE) techniques to pull random internet traffic from a live network as the stimulus for an AI model in a networking application, you’re never going to completely converge across design iterations, since you can’t compare results from run to run. This drives the need for determinism.

AI models themselves involve large numbers of small computations, typically performed on a large array of small computing engines. Their data requirements differ from those of many other applications, changing the way storage is built and accessed. And while computation may be done in a cluster for a given model, an application may have many such models, resulting in an overall fragmented design.
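
As a rough sketch of what that compute structure looks like – with the engine count, vector sizes, and data as arbitrary assumptions – a single layer’s work breaks down into many small multiply-accumulate jobs, each one small enough to hand to one of many identical compute engines:

    # Minimal sketch: one layer's output computed as many small per-engine dot products.
    import numpy as np

    ENGINES = 16                                  # assumed array of small compute engines
    rng = np.random.default_rng(0)
    activations = rng.random((ENGINES, 256))      # each engine owns a slice of the input
    weights = rng.random((ENGINES, 256))          # ...and the matching slice of weights

    # Every engine performs the same small multiply-accumulate job on its own slice;
    # the partial sums are then combined into a single output value.
    partials = [a @ w for a, w in zip(activations, weights)]
    output = sum(partials)

Scale that up to dozens of layers and several distinct models inside one application, and you get the large, fragmented designs described above, which is where emulation capacity and scalability come in.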
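
The same logic applies on the stimulus side of the determinism requirement. Instead of driving the design with live, unrepeatable traffic through ICE, a virtualized flow can capture stimulus once and replay the identical stream on every run. The sketch below is a hypothetical illustration only; the file name and packet sizes are assumed.

    # Minimal sketch: record stimulus once, replay the identical stream for every run.
    import os, pickle

    def capture(n_packets, path="stimulus.bin"):
        # Stand-in for a one-time capture of real traffic into a replayable file.
        packets = [os.urandom(64) for _ in range(n_packets)]
        with open(path, "wb") as f:
            pickle.dump(packets, f)

    def replay(path="stimulus.bin"):
        # Every verification run reads exactly the same recorded packets,
        # so results can be compared across design iterations.
        with open(path, "rb") as f:
            return pickle.load(f)

    capture(1000)                 # done once
    assert replay() == replay()   # every run sees identical stimulus

Whether the stimulus comes from a recorded capture or a virtual traffic generator, the point is the same: identical inputs on every run, so any difference between runs can only come from the design itself.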
