Even as discussions around the adoption of artificial intelligence (AI) agents intensify, many companies are not paying enough attention to the parallel risks around data security, integrity and cost. At the India AI Impact Summit 2026, homegrown AI transformation company Arinox AI and agentic AI company KOGO unveiled what they describe as India’s first sovereign AI product: a system built around the concept of “AI in a box”.

With CommandCORE, Arinox AI and KOGO are betting on a counterintuitive future for AI: private, sovereign and physically compact. The system is designed to compute locally, without an internet connection. The companies have partnerships with Nvidia and Qualcomm for the agent stack; the latest CommandCORE iteration runs on Nvidia hardware.
“The future of AI is private, at the enterprise level as well. You simply cannot leverage your intelligence. The only way for an organization to exponentially increase its own intelligence and learning is to keep AI private. It has to own AI,” says Raj K Gopalakrishnan, CEO and co-founder of KOGO AI, in a conversation with HT.
At its core, this “AI in a box” proposition is as much ideological as technical, pushing the conversation beyond large language models (LLMs) and GPUs. Organizations that use public foundation models do not only process prompts; they also expose operational information. “Sensitive industries, when they share data with foundation models and cloud-based AI services, also share information,” he adds.
Agentic AI deployments face twin threats around security and privacy. Context, Gopalakrishnan insists, changes everything. “The moment you provide context, you provide information.”
An AI Threat Landscape 2025 analysis by security platform HiddenLayer highlights that 88% of businesses are concerned about vulnerabilities introduced by third-party AI integrations, including widely used tools such as OpenAI’s ChatGPT, Microsoft Copilot and Google Gemini.
In August last year, an MIT report indicated that 95% of generative AI pilots in businesses had failed to take off, with privacy being a factor.
Idea and cost argument
There are four key layers to this private AI solution in a box. First, custom hardware from Nvidia. Second, KOGO’s agentic operating system. Third, an enterprise agent suite with more than 500 connectors for enterprise workflows. Fourth, open-source models that keep the AI sovereign.
Variants include Nvidia’s Jetson Orin edge systems for field deployments, DGX Spark for compact on-premises development, and enterprise data center setups including Nvidia RTX Pro 6000 Blackwell Server Edition GPUs.
“This box is designed to eliminate the complexities of hardware, software and application layers, which an enterprise would have to orchestrate independently. It will perform targeted workloads, repeatable tasks and can scale to large clusters for a complete workflow,” emphasizes Angad Ahluwalia, chief spokesperson of Arinox AI.
Scalability is achieved by connecting multiple units together. Companies can choose from three model configurations for now, with more iterations expected in the coming months, according to Ahluwalia. The price starts at ₹10 million.
CommandCORE’s smallest configuration can run models between 1 billion and 7 billion parameters, suited to businesses that want to deploy a handful of agents for batch processing or HR onboarding workflows. The mid-range configuration handles models between 20 and 30 billion parameters, for more complex agents with heavier inference demands.
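As a rough guide to why those parameter counts map to different hardware tiers, a common rule of thumb is that model weights at 16-bit precision need about 2 bytes per parameter. The sketch below applies that approximation; the 2-bytes-per-parameter figure is a general industry heuristic, not a CommandCORE specification, and it ignores KV-cache and activation memory, which add more on top.

```python
def weight_memory_gb(params_billion: float, bytes_per_param: int = 2) -> float:
    """Approximate memory needed just for model weights, in GB.

    Assumes 16-bit precision (2 bytes/parameter) by default; excludes
    KV cache, activations and runtime overhead.
    """
    return params_billion * 1e9 * bytes_per_param / 1e9

small = weight_memory_gb(7)    # 7B model  -> ~14 GB of weights
mid = weight_memory_gb(30)     # 30B model -> ~60 GB of weights
```

Quantizing to 8-bit or 4-bit weights roughly halves or quarters these figures, which is one way edge-class hardware can host larger models than the raw arithmetic suggests.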
“As AI adoption grows in regulated and sensitive environments, organizations need accelerated computing platforms that can operate entirely on-premises and under strict security controls,” says Vishal Dhupar, Managing Director, Nvidia India.
“Very large clusters, equivalent to Nvidia’s DGX clusters based on the Grace Blackwell series, are enterprise-wide transformation engines,” explains Ahluwalia. For context, Nvidia’s documentation notes that two of these DGX units, when interconnected, handle models with up to 405 billion parameters.
Why is a private, secure, and local AI system important beyond a sovereignty argument?
For Gopalakrishnan, the answer is also economic. He cites the example of commercial electric-vehicle charging and battery-swapping stations, each of which can generate up to 30TB of data daily. “If there are 1,000 stations owned by the same organization and they have to send all that data to the cloud, think about the cost,” he says.
The alternative is edge processing. “With a small device placed in each station, without the need for internet, they will probably only send 200GB of data to the cloud for processing.” In other words: filter and process locally, transmit selectively, and cut both bandwidth and cloud computing costs.
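The trade-off Gopalakrishnan describes can be sketched with back-of-the-envelope arithmetic. This is an illustrative Python sketch using the figures quoted in the article; reading the 200GB as a per-station daily volume after local filtering is an assumption on our part, not something the quote makes explicit.

```python
STATIONS = 1_000
RAW_TB_PER_STATION_PER_DAY = 30        # figure cited in the article
FILTERED_GB_PER_STATION_PER_DAY = 200  # assumed per-station volume after edge filtering

# Daily volume if every station ships raw data to the cloud
raw_gb_per_day = STATIONS * RAW_TB_PER_STATION_PER_DAY * 1_000  # TB -> GB

# Daily volume if each station filters locally and sends only summaries
filtered_gb_per_day = STATIONS * FILTERED_GB_PER_STATION_PER_DAY

reduction_factor = raw_gb_per_day / filtered_gb_per_day  # 150x less data egress
```

Under these assumptions the fleet would move 30 petabytes a day raw versus 200 terabytes filtered, a 150-fold reduction in data sent to the cloud before any compute savings are counted.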
Arinox and KOGO hope to find traction particularly in sensitive sectors such as finance and banking, government services and defense.
