Welcome to the spookiest episode of On Edge yet! In this Halloween edition, Carlos and Alicja discuss all the possible tricks SiMa.ai’s Model SDK has to offer customers and the treat of deploying models at the embedded edge. Whether you are applying our tech to large-scale manufacturing, or just having some fun with a robot arm you built to collect candy, this episode is built to spec for the ML Developer.
After Carlos kicks off the episode with a bang (or rather, a pop), our hosts share their own Halloween costumes. See why Alicja took home the prize for nerdiest costume in the room (0:15). They then go into detail on how customers can utilize SiMa.ai’s “future proof” Model SDK to make their models run with high performance and maximum efficiency (2:09), and how SiMa.ai has used this SDK to target models with manufacturing customers using depth cameras to perform this very episode’s application – robotic grasping (5:06).
Our hosts then take a short detour to discuss the news of the month. “Unsupervised human to robot motion retargeting via expressive latent space” might be a bit of a tongue twister, but Carlos shares why this paper could have a huge impact on modern robotics (6:25). Alicja then discusses a recent venture capital newsletter that explains why success in the real-world application of LLMs requires larger context windows (8:44). Carlos shares the news that Intrinsic, an Alphabet robotics company, is looking to democratize the field of robotics with the launch of a new web-based developer environment, and the duo draws parallels to SiMa.ai’s Palette Edgematic (11:25).
Carlos and Alicja then introduce Dr. Ashok Sudarsanam, SiMa.ai’s Senior Director of MLA Software, to share a more detailed look at the Model SDK (18:50). Ashok lays out the primary challenges faced when porting different models (21:15), and the key features of the Model SDK that help to overcome these roadblocks (21:50). The interview ends with a teaser of new features to come (25:31).
Then, the moment you’ve been waiting for – our hosts offer a live demo of the Model SDK in action! This episode’s code demo focuses on the advanced calibration and quantization techniques offered by the SDK, including loading a model from any framework (27:45), quantizing and evaluating it (34:14), and compiling it for deployment (47:35). Then, SiMa.ai’s Senior Field Applications Engineer Brian Oehlke shows how to evaluate the same example model within Palette Edgematic, SiMa.ai’s low-code visual development platform (48:22).
Our hosts close out the show with a brief preview of the next episode, including more in-depth demos to come, and answer viewer questions from last month’s spotlight on object detection (52:35). And, in keeping with the Halloween theme, a few treats are even handed out to a pair of lucky viewers (54:05).
Do you have a topic or discussion idea for a future episode? Perhaps a question, or just a general comment? We’d love to hear from you. Submit your thoughts to Carlos and Alicja, either via email at OnEdge@sima.ai, or in the comments section on our YouTube and LinkedIn pages.