OmniOps Launches Bunyan: Saudi Arabia’s First Inference Platform with Local Compliance

OmniOps launched Bunyan, Saudi Arabia's inaugural "Inference as a Service" platform, to enhance AI infrastructure.
Designed for Saudi data sovereignty, it speeds up AI inference tasks, reducing energy use and latency.
CEO Mohammed Altassan claims Bunyan is transforming AI deployment with easy model hosting and top-notch hardware.
Compatible with both cloud and on-premises deployments, it supports Saudi Arabia’s national data and AI strategy across key sectors.
OmniOps, founded last year, aims to boost AI scalability for local businesses beyond just generative buzzwords.
No doubt, the pace of AI innovation in Saudi Arabia has been something to watch lately. On that note, OmniOps, one of the Kingdom’s rising stars in AI infrastructure, has just rolled out Bunyan (بنيان) – a bit of a mouthful, officially described as Saudi Arabia’s first “sovereign Inference as a Service” platform. The launch followed conversations with the Minister of Communications & Information Technology. What stands out here – at least for those of us at Arageek who are always in the thick of supporting regional startups – is that Bunyan is tailored to ensure data stays put within Saudi Arabia, meeting those strict sovereignty and compliance rules that can otherwise be a real faff for enterprises.
The platform is being trumpeted for its slick technical chops: it zips through AI “inference” tasks (that’s when a trained model applies what it has learned to new information, if you’re not already knee-deep in the lingo), and does so at twice the usual speed. Even better, the company says energy usage is slashed by over half and latency is down by a good 40%. I’ll be honest, I reckon many corporates will be chuffed to bits if those numbers hold up in the real world – especially when everyone’s counting the riyals on their next electricity bill.
Mohammed Altassan, the founding CEO of OmniOps, says Bunyan is “revolutionising” the way organisations roll out AI solutions. Yes, I know, “revolutionising” gets bandied around a lot – but there’s no harm in a bit of ambition. Bunyan doesn’t just let you host out-of-the-box AI models; you can also lob in your own and deploy them with barely a fuss. Plus, it all runs on top-notch kit: think NVIDIA and Groq, names cropping up more and more in AI circles.
For anyone building tools like HR chatbots, document digests, or clever systems that sniff insights out of unstructured data, the platform promises a friendly interface and what they call “agentic workflows” – essentially, letting AI take on more of the legwork so teams can get value fast. I’m not a fan of hype for hype’s sake, but the fact that Bunyan is both GPU-agnostic and flexible with hosting (cloud and on-premises play nicely together) speaks volumes for local firms wanting to keep sensitive data right where it should be.
There’s a nod to big-picture strategy too: the platform aligns itself with Saudi’s National Strategy for Data and AI, targeting key sectors like government, energy, and healthcare. Getting that official tick of approval – and access to the latest AI tech, without the headache of cross-border data transfer – sounds, well… spot on.
A quick snapshot of OmniOps: founded just last year, they’re all in on providing the serious infrastructure backbone that lets businesses scale up their AI work, whether it’s on the edge, in the cloud, or tucked away on-premises. From what I’ve seen, they’re looking to move the needle beyond generative AI buzzwords, aiming to help Saudi companies leap ahead in the global AI race.
So, from my perch at Arageek, where energising and pushing MENA startups is more than just a tagline, seeing homegrown players like OmniOps step up the game reminds me why I got into the entrepreneurship beat in the first place – even if spelling “sovereignty” correctly is sometimes a challenge!