

Overcoming Challenges with Flexible Design and In-Depth Interviews: How Marubeni Chatbot Maximizes In-House LLM Use
CHECK POINT
- Marubeni Chatbot was rolled out to enhance the efficiency of Marubeni’s internal operations.
- Focused on addressing internal challenges, Marubeni Chatbot reduces employee workloads.
- A high level of abstraction enables a flexible tool not restricted to specific language models.
Development Started the Day After GPT-4’s Release
Since OpenAI released its large language model (LLM) GPT-4 in 2023, the development and deployment of AI chatbots have accelerated across various industries. Many companies are now focusing on customizing these chatbots to fit their internal data and environments. Integrating internal data and tacit knowledge into LLMs helps unlock siloed information, boost operational efficiency, and reduce employee workload.
Marubeni began developing its own in-house chatbot, Marubeni Chatbot, the day after GPT-4 was announced. Recognizing GPT-4’s significant improvements in text comprehension and output capabilities, the company immediately explored its business applications. A small, agile team led a speedy development process, from defining requirements to designing the operational flow. As an internal organization dedicated to digital transformation, the Digital Innovation Department enabled rapid prototyping and development even within a large trading company.
Addressing Challenges Through In-Depth Internal Interviews
A common barrier to using chatbots is information security. Marubeni Chatbot was developed using Microsoft services to ensure robust information security while being capable of responding to queries involving internal information.
To truly enhance business efficiency, it is essential to continuously expand and refine use cases. Interviews with various departments identified document creation and responding to inquiries as high-cost activities. As a result, the first feature developed makes it possible to generate responses and draft minutes based on internal regulations and user-provided documents. The chatbot is currently used for ideation across departments, with additional functions tailored to specific operations under development.
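The article does not describe how this feature is implemented. As a rough illustration only, a document-grounded answer flow might look like the sketch below, in which the in-memory "index", the prompt wording, and the call_llm helper are hypothetical placeholders rather than Marubeni's actual design.

```python
from dataclasses import dataclass

@dataclass
class Passage:
    source: str  # e.g. the regulation or document title
    text: str    # the retrieved excerpt

# Toy in-memory "index" standing in for a real search service over
# internal regulations and user-provided documents (illustrative only).
INTERNAL_DOCS = [
    Passage("Travel Expense Regulation", "Overseas per diem requires prior approval..."),
    Passage("Meeting Minutes Template", "Minutes must record attendees, decisions, and action items..."),
]

def search_internal_docs(query: str, top_k: int = 3) -> list[Passage]:
    """Very naive keyword retrieval; a real system would use a proper search index."""
    terms = query.lower().split()
    scored = [(sum(t in p.text.lower() for t in terms), p) for p in INTERNAL_DOCS]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [p for score, p in scored[:top_k] if score > 0]

def call_llm(prompt: str) -> str:
    """Placeholder for a call to whichever language model is configured."""
    raise NotImplementedError("wire this to the configured LLM client")

def answer_with_internal_context(question: str) -> str:
    # Retrieve the passages most relevant to the employee's question,
    # then ask the model to answer only from that internal context.
    passages = search_internal_docs(question)
    context = "\n\n".join(f"[{p.source}]\n{p.text}" for p in passages)
    prompt = (
        "Answer the question using only the internal documents below, "
        "and name the documents you relied on.\n\n"
        f"Documents:\n{context}\n\nQuestion: {question}"
    )
    return call_llm(prompt)
```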
A chatbot’s value is diminished if it isn’t utilized effectively. Successful deployment of an in-house chatbot involves engaging with each department to understand their needs and fostering adoption through targeted events. For instance, Marubeni held company-wide generative AI study sessions in April 2023, when ChatGPT gained popularity, and again in October of the same year, when Marubeni Chatbot was launched company-wide. Specific use cases are shared and department-specific seminars on leveraging generative AI are held to further increase user engagement.
The team also closely monitors daily usage through visualized metrics and conducts surveys and interviews with heavy users to better understand their needs and improve functionality. As a result, the monthly active usage rate for the entire Marubeni group exceeds 50%, with Marubeni alone achieving over 60%.
The challenge in developing chatbots lies in the difficulty of fully defining requirements from the outset. Frequent feature additions and specification changes are needed post-launch to address evolving needs and advances in the LLM domain. The development of Marubeni Chatbot focused on designing its database and architecture to ensure flexibility.
Constant updates are necessary as AI technology rapidly evolves. The Digital Innovation Department has minimized costs by maintaining a lean team for development, operations, and maintenance, employing strategies such as a DevOps approach, aggressive use of BaaS, and distributed databases. The flexible design of the GPT-4-based tool allows for swift improvements: the system is updated two to three times a week, and new models can be deployed rapidly, as with the immediate adoption of GPT-4o in May 2024.
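The article does not show how model versions are managed, but one common pattern consistent with this kind of rapid swap is to treat the model identifier as configuration rather than code. The sketch below is an assumption for illustration; the configuration keys, values, and the resolve_model helper are not Marubeni's actual implementation.

```python
# Illustrative configuration-driven model selection; keys and values are assumptions.
MODEL_CONFIG = {
    "default_model": "gpt-4o",   # changed from "gpt-4" when a newer model is adopted
    "fallback_model": "gpt-4",
    "max_output_tokens": 1024,
}

def resolve_model() -> str:
    """Return the model identifier the application should use.

    Application code calls this helper instead of hard-coding a model name,
    so upgrading to a newer model is a configuration change and a redeploy,
    not a code change scattered across features.
    """
    return MODEL_CONFIG["default_model"]
```

Under this kind of separation, an update like the switch to GPT-4o becomes a single configuration change followed by a redeploy.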
Building a Flexible Tool Through High-Level Abstraction
When the Marubeni Chatbot was launched, GPT-4 was at the forefront, but today, many companies are developing LLMs, including Google and Anthropic. In this context, selecting the LLM that best aligns with business and user needs is crucial. Over-reliance on a single LLM could diminish overall performance, so minimizing dependence on any one LLM is a key feature of Marubeni Chatbot.
The key to developing in-house chatbots may be designing at a high level of abstraction. A higher level of abstraction provides greater operational flexibility. This approach allows a focus on user requirements without being constrained by system limitations.
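The article does not publish the actual interfaces, but designing at a high level of abstraction typically means the chat logic depends on a small provider-agnostic interface, with each vendor's model wrapped behind it. The sketch below is illustrative only; the class names, methods, and configuration keys are assumptions, not Marubeni Chatbot's design.

```python
from abc import ABC, abstractmethod

class ChatModel(ABC):
    """Provider-agnostic interface that the rest of the chatbot depends on."""

    @abstractmethod
    def complete(self, system_prompt: str, user_message: str) -> str:
        ...

class OpenAIChatModel(ChatModel):
    def __init__(self, model_name: str) -> None:
        self.model_name = model_name

    def complete(self, system_prompt: str, user_message: str) -> str:
        # Call the OpenAI-hosted model here (omitted in this sketch).
        raise NotImplementedError

class AnthropicChatModel(ChatModel):
    def __init__(self, model_name: str) -> None:
        self.model_name = model_name

    def complete(self, system_prompt: str, user_message: str) -> str:
        # Call the Anthropic-hosted model here (omitted in this sketch).
        raise NotImplementedError

def build_chat_model(config: dict) -> ChatModel:
    """Choose a provider from configuration so a swap needs no changes to chat logic."""
    provider = config["provider"]
    if provider == "openai":
        return OpenAIChatModel(config["model"])
    if provider == "anthropic":
        return AnthropicChatModel(config["model"])
    raise ValueError(f"unknown provider: {provider}")
```

Because the conversation, retrieval, and logging layers would only see the ChatModel interface, evaluating or switching to another vendor's model reduces to adding one adapter class and updating configuration.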
Chatbots are merely tools; the true objective is to solve internal challenges. The continuously evolving Marubeni Chatbot demonstrates how a company can build a productive relationship with AI.