  • July 23, 2025
  • By Brent Leary, Managing Partner of CRM Essentials, Cofounder of PPN

Enterprise LLMs: The DIY vs. Vendor Dilemma - A Few Good Minutes Debate


Recently I invited some of my favorite industry folks, longtime influencers, and thought leaders to join me on a special episode of A Few Good Minutes to discuss where the industry stands at the midyear point. The folks who joined me are:

  • Sheryl Kingstone - VP, Customer Experience & Commerce; General Manager, Voice of Connected User Landscape at 451 Research
  • Esteban Kolsky - Chief Distiller and Board Advisor at Constellation Research
  • Jon Reed - Cofounder at diginomica

We touched on a number of issues, including the $500 billion Stargate program, what to expect from the President's upcoming speech on his AI strategy, the industry's overall handling of the agentic AI rollout, and what the rest of the year might hold. But the thing that caught everybody's attention was last week's announcement that Zoho had built its own LLM from scratch, which sparked a conversation about whether enterprise-level corporations should look into building their own LLMs from scratch as well.

Short Analysis via Gemini Pro

Jon Reed highlights the complexity and potential pitfalls for enterprises attempting to implement large language models (LLMs) on their own, often involving intricate architectures like RAG (Retrieval Augmented Generation) and agent tool calling. He notes that while sophisticated customers and vendors can succeed, the average customer may struggle without direct vendor support.

Brent Leary brings up Zoho's approach of hosting its own LLM and data centers. Esteban Kolsky strongly advocates for enterprises hosting their own LLMs to ensure data quality and content. He argues it's more economically efficient than buying tokens and cites examples of organizations creating their own models, including Switzerland's national LLM. Kolsky believes relying on open models compromises quality and that, while initial costs might be higher, maintenance is comparable to that of a knowledge base.

Jon Reed adds nuance to this by explaining that while Zoho built its LLM from scratch, most companies could instead use open-source technology; he doesn't recommend building from scratch because of the significant development resources it requires. Sheryl Kingstone agrees with this recommendation.
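
To give a sense of the complexity Reed describes, here is a deliberately minimal sketch of a RAG pipeline with a single agent tool call. Every function and model name in it is a hypothetical placeholder rather than any vendor's actual API; in a real deployment, each step (embedding, vector search, tool integration, prompt assembly, model selection) is a component somebody has to choose, tune, and maintain.

```python
# Minimal, illustrative RAG + tool-calling loop. All names here (embed,
# vector_search, call_llm, get_account_status) are hypothetical placeholders,
# not any specific vendor's API.

from dataclasses import dataclass


@dataclass
class Document:
    doc_id: str
    text: str


def embed(text: str) -> list[float]:
    # Placeholder: a real system would call an embedding model here.
    return [float(ord(c)) for c in text[:8]]


def vector_search(query_vec: list[float], corpus: list[Document], k: int = 3) -> list[Document]:
    # Placeholder: a real system would query a vector database with query_vec.
    return corpus[:k]


def get_account_status(account_id: str) -> str:
    # Placeholder "tool": a real agent would call a CRM or billing API.
    return f"Account {account_id}: active, renewal due in 30 days"


def call_llm(prompt: str, model: str = "small-model") -> str:
    # Placeholder: a real system would route this to a hosted or local LLM,
    # deciding model size and whether a reasoning mode is needed per request.
    return f"[{model}] draft answer based on: {prompt[:60]}..."


def answer(question: str, corpus: list[Document], account_id: str) -> str:
    # 1. Retrieve customer-specific context (the RAG part).
    context = vector_search(embed(question), corpus)
    # 2. Call a tool for live, structured data (the agent tool-calling part).
    status = get_account_status(account_id)
    # 3. Ground the model's answer in both.
    prompt = (
        "Answer using ONLY the context below.\n"
        + "\n".join(d.text for d in context)
        + f"\n{status}\nQuestion: {question}"
    )
    return call_llm(prompt)


if __name__ == "__main__":
    docs = [Document("kb-1", "Renewals are processed 30 days before expiry.")]
    print(answer("When does this customer renew?", docs, "ACME-42"))
```

Even at this toy scale, Reed's point holds: deciding which model to call, what to retrieve, and which tools to expose is ongoing engineering work, not a one-time setup.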

Below is a clip of the conversation where the points and counterpoints were flying all over the place on the issue, followed by an edited transcript of the exchange.

You can check out the whole episode at https://youtube.com/live/9qploLR0nVU?feature=share

Edited Transcript

Jon Reed: Whatever's going on with OpenAI and Anthropic and those AI platform companies, what enterprises are doing to get results from this and what tech vendors are doing is very different. It's a much more constrained architecture that includes the incorporation of customer-specific data through things like RAG and agent tool calling and things like that.
The downside to this architecture is it's rather complex to maintain and manage. And it involves you knowing when do we use this model? When do we downsize this model? When do we use a reasoning engine? When do we not? How do we iterate on this internally? And the problem with that is that customers can run into a lot of trouble if they try to do this on their own.
And there's a bunch of stats predicting failures in these projects, and we've seen accumulated failures in these projects. You've already seen it. So that's one of the problems with the architecture we've arrived at.
If you go on YouTube right now and look up RAG implementation, the chart of all the components is gonna give you a brain freeze. And that's what the vendors are successfully doing and that's what very sophisticated customers are successfully doing. But I believe the average customer is somewhat locked out of this unless they work with a vendor directly.

Brent Leary: Zoho just came out today and introduced their LLM. They run their own data centers. They build their own servers. They host everything internally. Is that the right approach from an enterprise perspective?

Esteban Kolsky: 100%

Sheryl Kingstone: But that's Zoho hosting their LLM, not an enterprise hosting their LLM.

Esteban Kolsky: Everybody who's interested in the quality of the data and the content provided and produced by the LLM should host their own. It is not really that hard and actually much more economically efficient than buying tokens from the open market.
Switzerland created an LLM for the entire country that answers all the questions in the country. I have about a dozen cases from companies, organizations, and institutions that have recently created their own language models for the same reason.
Even relying on an open model for anything you do is absolutely stupid and demeaning to the quality that you want to achieve. If you want to achieve good quality and good results create your own LLM. The initial cost may be a little more than you think it is but the maintenance is actually just the same as it costs you to maintain a knowledge base, if not better.

Jon Reed: There are some nuances here because you can build your own LLM with open source technology. Zoho actually built their own LLM from scratch but I don't recommend that for most folks. That requires significant development resources.
The thing that's really important about what Zoho has done here is that they know exactly what they train their model on, which you can't say when using external models, even open-source ones, because very few of the open-source models disclose what they've been trained on.
That's really powerful to be able to say that to your customers. Having said that, Zoho is gonna continue to offer access to external models. That is sensible as well because even though I have been pretty critical of those models, they do spend an absolutely ridiculous amount of money on developing those models; and Zoho doesn't want to force customers onto its proprietary technology before it's ready.
It's a very sensible move, Brent, and in general enterprise vendors have an obligation to be much more transparent about their enterprise AI architecture and where the data is coming from. And there are a lot of minute details, like what happens to your log files, that are actually tremendously important, because you can walk around and say "They don't train on our data."
By default they're still taking all your log files. That's what they do. They're not training on it but they still have it.

Esteban Kolsky: Taking that one step further, have Zoho create a core enterprise LLM, offer it to customers for a modest fee to cover maintenance, and then offer it to you as an instance shaped to be your own LLM. You can use it for whatever you need: an enterprise model with your data and nobody else's.

Sheryl Kingstone: That's what we're doing. That's what the model is. Everyone is taking that subset of it.

Esteban Kolsky: But there's a difference between a public model and one that is offered to you for a price, where, when you buy it, it's your own personal model that is brought into your organization. And it only grows or extends the way that you want it to.

Jon Reed: I really like that about Zoho's approach and they're not the only vendor doing some of these things, but Zoho's doing a lot of work on figuring out exactly what size model is appropriate for what size task. And the beauty of that is you become much more operationally efficient and you're not consuming excess energy using LLMs for things you don't need to use them for.
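
To illustrate Reed's closing point about matching model size to task, here is a small routing sketch. The model tiers and the routing heuristic are assumptions made up for illustration, not Zoho's (or any vendor's) actual logic.

```python
# Illustrative sketch of routing requests to right-sized models.
# The model names and thresholds are assumptions for illustration only.

def pick_model(task: str, needs_reasoning: bool, context_tokens: int) -> str:
    """Route each request to the smallest model that can plausibly handle it."""
    if task in {"classify", "extract_fields", "route_ticket"}:
        return "small-7b"            # cheap and fast; fine for narrow tasks
    if needs_reasoning or context_tokens > 32_000:
        return "large-reasoning"     # reserve the expensive model for hard cases
    return "medium-30b"              # default for general drafting and summarizing


if __name__ == "__main__":
    print(pick_model("classify", needs_reasoning=False, context_tokens=500))
    print(pick_model("draft_reply", needs_reasoning=True, context_tokens=2_000))
```

The operational win Reed describes comes from reserving the largest, most energy-hungry models for the few requests that genuinely need them.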
