Publishers who want to experiment with generative AI technology to build products and features, such as chatbots and data analysis tools, must evaluate which large language models best fit their needs.
It turns out that one of the biggest factors in those evaluations is how easily an LLM integrates with a publisher's existing technical systems, such as its apps and content management platforms, according to interviews with three publishers. In practice, that often means choosing LLMs owned by companies they already have enterprise technology or content licensing agreements with.
For example, an executive at one publisher, who asked to remain anonymous, told Digiday that their company is experimenting with a number of different LLMs but primarily uses OpenAI's models. The company has a content licensing agreement with OpenAI, which followed a successful project building a chatbot with OpenAI's GPT models. The publisher has continued to use GPT for other needs, such as productivity tools.
Another publishing executive, who spoke on condition of anonymity, said their company uses OpenAI's models for internal use cases over other LLMs simply because it has a content deal with OpenAI.
A third publishing executive, who also requested anonymity, said their company uses LLMs owned by companies such as Microsoft and Google because it already uses their business software, including Microsoft Office 365 and Google Workspace.
That makes it "very easy for us to integrate into our full development ecosystem," the third publishing executive said. As a Microsoft Office 365 customer, the company can "integrate [Copilot] natively into the tools we already use, like Outlook, Excel, PowerPoint [and] Word," they added.
Nate Landau, chief product and technology officer at theSkimm, said that using technologies offered by companies it already has agreements with means the publisher doesn't have to build its own solutions.
"Some of our partners, like [data storage company] Snowflake, offer their own AI integrations, and we prefer those over building from scratch," he said.
And because theSkimm doesn't have an exclusive agreement with any AI technology company, it has the flexibility to work with different models, Landau added.
"Due to the rapidly evolving nature of this space and the varying strengths and weaknesses of different models, we have not standardized on a single provider," he said. "We evaluate multiple providers for each use case to ensure the best fit."
Other factors media companies take into account when evaluating LLMs include cost, performance, the quality of model outputs, and personal data and security protections.
"It's extremely important for us that the models are not trained on our users' information or inputs, and that any interactions with AI products are safe and secure," said Vadim Supitskiy, chief digital and information officer at Forbes.
LLMs also have to perform well for publishers' use cases to make them worth the time and money.
When choosing which LLM to use for different processes, theSkimm tests models side by side and compares their outputs "to ensure they align with our brand voice and editorial standards," Landau said. That's because the main differences between these models are their "voice, tone and accuracy across different use cases," he added.
For example, Claude has a "softer, more natural tone" and is "particularly accurate," so Landau prefers it for tasks where the voice and tone of a response are especially important. But for tasks requiring less creativity, such as working with theSkimm's data sets, he prefers models like Meta's Llama "for their reliability in providing accurate answers and avoiding hallucinations."
However, as LLMs rapidly evolve, those differences are "becoming more subtle," Landau said.
theSkimm uses LLMs primarily for three things: data analysis, publishing and experimentation, according to Landau. The models help surface trends and cohorts from the company's data, and help create audience segments to target messaging and products, for example for its shopping and commerce businesses and for sponsored content.
The third publishing executive said their company uses Google Gemini and its services to develop products, such as building chatbots. For internal efficiency tools, it uses Microsoft Copilot. The company is currently evaluating Gemini's AI features integrated into Google Workspace (which includes products like Gmail, Docs and Meet).
Mark Howard, chief operating officer at Time, said he considers financial incentives to work with one LLM over another (Time has a content licensing agreement with OpenAI), as well as the opportunity to participate in future product development and to have a seat at the table in shaping new products that could benefit publishers.
"Some of [these deals] are more about being part of the new marketplaces they're developing. And for me, that's the part that is most interesting — companies that really want to build something that doesn't currently exist. As they build it, I'd rather be a part of it," Howard said. Time also has agreements with Perplexity and ProRata to participate in their ad revenue-sharing and compensation structures.
theSkimm currently uses a combination of open-source models (such as Meta's Llama and France's Mistral) and commercial products, including Anthropic's Claude and OpenAI's GPT models.
The company weighs the "pros and cons" of paying fees for private API access to LLMs against the potentially more cost-effective option of hosting its own open-source models, Landau said.