Three Vendors' Three Answers to the AI Pricing Issue: Ownership, Optimization, and Simplification
We're coming into the home stretch of the spring conference season, and two things have held true so far: Vendors can't stop talking about their agentic AI capabilities, and most vendors have had difficulty explaining their pricing models for all of this agentic AI stuff.
There is a lot of uncertainty about what these AI capabilities will cost the companies using them and which challenges those capabilities can actually solve, and vendors are trying to figure out how much horsepower they'll need to cover customer usage, both now and in the future. For agentic AI to move from the talk stage to a scaled-up, accelerated deployment stage, the industry will have to figure out pricing models that companies can understand and feel confident deploying. Or, better yet, pricing models that make companies feel they can't afford to wait.
With that said, I have been able to speak with a few vendor executives who are not having the same difficulty talking about the pricing aspects of AI: Alan Trefler (founder and CEO of Pegasystems), Raju Vegesna (chief evangelist of Zoho), and Burley Kawasaki (chief product officer of Creatio). Each of them had interesting perspectives on pricing, which are represented in this short clip. Check out what they had to say about pricing and how their companies are handling it. Below the clip is an edited transcript of the conversation and an analysis of the transcript by Perplexity.
Edited transcript
Brent Leary: There's one subject that's been kind of difficult for most vendors to talk about with pricing around AI…
Raju Vegesna: It should not be priced separately; it should be included by default. The price for AI is $0 in most of the things that we have done.
AI is resource-hungry, and, as a result, vendors are wallet-hungry. But if you smartly allocate the resources, if you smartly design the system, you don't have to be wallet-hungry.
There are a lot of layers in the AI stack that are going to become a commodity. Models are racing toward commodity. Vendors are wallet-hungry because they don't own the resources. If you are running on a public cloud, you are being charged for the public cloud. Now, if you own your cloud, you don't have a problem, right?
So we can simply take the heat and keep delivering value to customers. This goes back to us doing our own data centers and all of that. We knew that this would be a nice long-term strategy for playing a long game. You would architect it a certain way, and now we are reaping the benefits of it because we own the infrastructure.
Now the chips are in our data centers. We are not paying someone else to run on someone else's infrastructure, so we can pass the profits and value directly to the customers. And the price of AI in all our applications at this point is pretty much zero.
Burley Kawasaki: Last year people were talking a lot about AI, and they were spending on pilots. But if you look at a lot of the data, there have been a number of studies on this, very few were making it out of pilots. And even fewer still were really delivering value. And to your point, a lot of it was generative, right?
It was actually adding tasks because a human had to know how to coach (or to prompt) the AI to do something useful. And it didn't guarantee ultimately that it was going to deliver an outcome.
So this year, as we're working with our clients, one of the key things we're trying to do is remove the friction that keeps them from getting to some new AI-enabled process.
One of the decisions we fundamentally made last year was that we're not going to charge for AI, right? Because so much of the customer discussion with us was concern. It's like when they moved to the cloud a decade or whatever ago; they were worried that if they moved to the cloud they were going to get a big bill. And the same concerns were loud and clear when it came to not knowing what their AI bill was going to be.
All the vendors had different confusing pricing models, right? "Am I really going to use it? If I don't use it I'm paying a lot for this technology that is really not going to deliver". And so we made a decision to not charge. Full stop.
Alan Trefler: We don't have the problem most of our competitors do, because about five years ago we stopped pricing by user. And the reason we did that is we found that our systems were massively reducing the number of users. So we instead began pricing based on the work: the number of work items done in a month or a year, with a price based on that quantity.
So we don't have a pricing structure problem, because of the way we do this: we do the design using the reasoning and the power of the LLMs.
When Pega Blueprint is running, that's chewing through tokens, right? That's just really doing logic, thinking about the best way to do stuff. Then you put it all together at runtime, which is a factor of a thousand more [power consumption] usage than at design time.
If you think about the ratios in a typical sort of company, the workflows are literally 100 times more efficient than running the LLMs. And since they've been authored by the LLMs, you get the best of both worlds.
Perplexity analysis of the conversation
Summary of Vendors on Pricing AI
Key Themes:
- AI pricing models are evolving, with a trend toward including AI at no additional cost.
- Vendors are addressing customer concerns about unpredictable and confusing AI costs.
- Infrastructure ownership and efficient system design are crucial to cost management and value delivery.
Highlights from Each Speaker:
Raju Vegesna (Zoho):
- Argues that AI should not be priced separately, advocating for a $0 price point for AI features within Zoho products.
- Points out that AI is resource-intensive, making vendors eager for revenue ("wallet hungry"), especially when relying on public cloud infrastructure.
- Emphasizes that by owning their own data centers and infrastructure, Zoho can avoid external costs and pass value directly to customers, keeping AI features essentially free.
Burley Kawasaki (Creatio):
- Notes that in the previous year, many organizations piloted AI but struggled to move beyond pilot stages or realize value, partly due to the need for human prompting and unclear outcomes.
- Creatio decided not to charge separately for AI, responding to customer fears of unpredictable billing, similar to early cloud adoption anxieties.
- Criticizes the confusing and varied pricing models across vendors and stresses that Creatio’s approach is to eliminate this friction for customers by making AI available at no extra cost.
Alan Trefler (Pegasystems):
- Explains Pegasystems shifted away from user-based pricing five years ago, instead charging based on the number of work items processed.
- This change was prompted by AI-driven efficiency, which reduced the number of users needed.
- Describes how Pegasystems leverages large language models (LLMs) for system design, which is resource-intensive, but the resulting workflows are much more efficient in production, offering cost and efficiency benefits to customers.
Conclusion
Major vendors are moving toward integrating AI features at no additional cost, driven by infrastructure strategies, customer concerns about unpredictable pricing, and the efficiency gains from AI-driven automation. Ownership of infrastructure and innovative pricing models are central to delivering value and managing the costs associated with AI deployment.