
Cisco CEO Chuck Robbins says 2026 will mark a turning point for enterprise and large-scale AI use. Leading off today’s Cisco AI Summit, he told attendees that this will be the year of agentic applications and of building greater trust and security into AI.
“There are lots of questions and discussions about what does it mean to your enterprise infrastructure? What does it mean to your security posture? What does it mean to application development cycles?” Robbins said.
“One thing that bothers us is trust – trust in what’s going to happen to your data, trust in the models, trust in your infrastructure, trust in the agents, trust in the partners that you’re working with. Those are important issues that the industry needs to continue to address with AI going forward,” Robbins said.
AI obstacles
There are three main constraints that are holding AI back, added Jeetu Patel, Cisco’s president and chief product officer.
“The first one is that we have an infrastructure constraint. We just don’t have enough power, compute and network bandwidth. So that’s going to be an area that we are spending billions on at Cisco. And I think the industry is spending trillions in making sure that we can… go out and fulfill the needs of AI,” Patel said. “We’re working very hard to make sure that we can build out the critical infrastructure for AI.”
Patel pointed to Cisco’s P200 chip and the Cisco 8223 routing system announced last October that will be at the heart of building Cisco-networked AI clusters.
“The P200 chip is for scale-across, because what’s happening now is these models are getting bigger, where they don’t just fit within a single data center. You don’t have enough power to pull them into a single data center,” Patel said. “So now you need to have data centers that might be hundreds of kilometers apart, that operate like an ultra-cluster that is coherent. And so that requires a completely different chip architecture to make sure that you have capabilities like deep buffering and so on and so forth… You need to make sure that these data centers can be scaled across physical boundaries.”
“In addition, we are reaching the physical limits of copper and optics, and coherent optics especially are going to be extremely important as we start building out this data center infrastructure. So that’s an area where you’re starting to see a tremendous amount of progress being made,” Patel said.
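Patel’s point about deep buffering follows from simple physics: a router bridging sites hundreds of kilometers apart has to absorb all the data in flight on the link, which grows with both bandwidth and distance. A rough sketch of that bandwidth-delay arithmetic (the function name and figures here are illustrative, not Cisco specifications):

```python
# Back-of-the-envelope: why deep buffers matter when data centers
# sit hundreds of kilometers apart. Illustrative figures only.

SPEED_OF_LIGHT_FIBER_KM_S = 200_000  # light travels at roughly 2/3 c in fiber

def buffer_needed_gb(link_gbps: float, distance_km: float) -> float:
    """Bandwidth-delay product: gigabytes in flight over one round trip."""
    rtt_s = 2 * distance_km / SPEED_OF_LIGHT_FIBER_KM_S
    return link_gbps * rtt_s / 8  # gigabits -> gigabytes

# A single 400 Gbps link between sites 500 km apart:
print(round(buffer_needed_gb(400, 500), 2))  # → 0.25 GB in flight
```

A quarter gigabyte per 400G link may sound small, but across the thousands of links in an AI cluster it adds up fast, and a router without buffers sized for that in-flight data drops packets whenever a burst meets long-haul latency.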
The second constraint is the AI trust deficit, Patel said. “We currently need to make sure that these systems are trusted by the people that are using them, because if you don’t trust these systems, you’ll never use them,” Patel said.
“This is the first time that security is actually becoming a prerequisite for adoption. In the past, you always asked the question of whether you want to be secure or you want to be productive, and those were needs that kind of offset each other,” Patel said. “We need to make sure that we trust not just using AI for cyber defense, but that we trust AI itself.”
The third constraint is the notion of a data gap.
AI models get trained on human-generated data that’s publicly available on the Internet, but “we’re running out,” Patel said. “And what you’re starting to see happen is synthetic data is getting to be extremely potent in training these models.” The fastest-growing category of data is machine-generated, he said. “As agents get more and more prolific, and as you have these agents working 24/7, you will see continued acceleration and exponential growth in machine-generated data. At Cisco, it turns out we are the center of all of this stuff,” Patel said.
Patel also noted how AI-assisted programming has altered software development inside the company.
“It seemed like a far-fetched goal at the beginning of 2025, but now 70% of all AI products being developed at Cisco use code that’s generated by AI. I would say that within the year 2026, we will have close to half a dozen products that’ll have 100% of the code written by AI rather than by humans,” Patel said. “Humans will still have a very valuable role to play, because they’re going to make sure that they’re writing specs and reviewing the code. But the bottleneck is no longer going to be around the writing of the code. The bottleneck is going to be around the reading and reviewing of the code.”
