Company: OSRAM
Title: Scaling Knowledge Management with an LLM-Powered Chatbot in Manufacturing
Industry: Tech
Year: 2025

Summary (short)
OSRAM, a century-old lighting technology company, faced challenges with preserving institutional knowledge amid workforce transitions and accessing scattered technical documentation across their manufacturing operations. They partnered with Adastra to implement an AI-powered chatbot solution using Amazon Bedrock and Claude, incorporating RAG and hybrid search approaches. The solution achieved over 85% accuracy in its initial deployment, with expectations to exceed 90%, successfully helping workers access critical operational information more efficiently across different departments.
OSRAM's journey into implementing LLMs in production represents a fascinating case study of digital transformation in a traditional manufacturing environment. The company, which has evolved from a simple lighting manufacturer into a sophisticated technology provider, faced significant challenges in managing and transferring knowledge across its operations.

The primary challenge stemmed from two main factors:

* An aging workforce with critical institutional knowledge preparing to retire
* Technical documentation scattered across multiple systems and formats, including legacy file types

The manufacturing context presented unique requirements and constraints for the LLM implementation:

* Need for high accuracy due to the critical nature of manufacturing operations
* A diverse user base with varying technical expertise and language skills
* Multiple data formats and sources that needed to be integrated
* A requirement for offline documentation access
* The critical importance of preventing hallucination in operational instructions

The technical solution, developed in partnership with Adastra, showcases several sophisticated LLMOps practices.

**Architecture and Infrastructure**

The solution is built on AWS infrastructure, leveraging Amazon Bedrock as the foundation for LLM operations.
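As a rough illustration of what a Bedrock-backed RAG call can look like, here is a minimal sketch using the Bedrock Converse API. The function names, model ID, region, and prompt wording are illustrative assumptions, not OSRAM's actual implementation.

```python
"""Sketch of a RAG-style request to Claude via Amazon Bedrock.

All names, the model ID, and the prompt wording are illustrative
assumptions; they are not taken from OSRAM's implementation.
"""

def build_messages(question: str, context_chunks: list[str]) -> list[dict]:
    """Assemble a Converse-API message that grounds the model in the
    retrieved documentation and tells it to refuse when unsure."""
    context = "\n\n".join(context_chunks)
    prompt = (
        "Answer the question using ONLY the documentation excerpts below. "
        "If the answer is not in the excerpts, say you do not know.\n\n"
        f"Documentation:\n{context}\n\n"
        f"Question: {question}"
    )
    return [{"role": "user", "content": [{"text": prompt}]}]


def ask_claude(question: str, context_chunks: list[str]) -> str:
    """Invoke Claude through Bedrock (requires boto3 and AWS credentials)."""
    import boto3  # deferred import so prompt-building works without AWS

    client = boto3.client("bedrock-runtime", region_name="eu-central-1")
    response = client.converse(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # illustrative model ID
        messages=build_messages(question, context_chunks),
    )
    return response["output"]["message"]["content"][0]["text"]
```

Keeping prompt construction separate from the Bedrock call makes the grounding and refusal instructions easy to unit-test without touching AWS.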
The architecture includes:

* S3 buckets for document storage
* Amazon OpenSearch for vector storage and hybrid search capabilities
* Claude (via Amazon Bedrock) as the primary LLM
* Streamlit for the user interface
* Terraform for infrastructure as code
* A comprehensive CI/CD pipeline for deployment and scaling

**Data Processing and Retrieval Strategy**

A particularly noteworthy aspect of the implementation is the hybrid search approach:

* RAG (retrieval-augmented generation) for context-aware responses
* Traditional keyword search for technical terms and specific documentation
* Vector database storage for efficient similarity search
* A pre-processing pipeline for handling various document formats

**Production Safeguards and Quality Control**

The implementation includes several important safeguards for production use:

* Explicit handling of out-of-context queries, with clear communication when information isn't directly available
* Integration of conversation history for context-aware responses
* A user feedback mechanism for continuous improvement
* Regular accuracy measurements and monitoring
* Clear guardrails to prevent harmful hallucinations

**Deployment and Scaling Strategy**

The team adopted a pragmatic approach to deployment:

* Initial focus on the shop floor and R&D areas
* Iterative improvement based on user feedback
* Planned expansion to other departments and plants
* A robust CI/CD pipeline enabling rapid deployment to new areas
* An infrastructure-as-code approach for consistent deployment

**User Experience and Training**

Significant attention was paid to user adoption:

* Simple, accessible interface design
* "Artificial intelligence consultation hours" held on the shop floor
* Direct feedback through a thumbs up/down mechanism
* A focus on making the solution accessible to users with varying technical backgrounds

**Results and Metrics**

The implementation has shown promising results:

* Initial accuracy rates exceeding 85%
* Projected accuracy improvement to over 90%
* A successful reduction in time spent searching for technical information
* Positive user adoption across different technical levels
* Recognition through an Industry 4.0 award

**Challenges and Lessons Learned**

The case study reveals several important insights for LLMOps in manufacturing:

* The importance of combining multiple search strategies for technical documentation
* The need for a careful balance between automation and human oversight
* The value of iterative deployment and feedback loops
* The importance of robust infrastructure for scaling

**Future Directions**

The team has outlined several next steps:

* Expansion to sales and marketing departments
* Deployment to additional OSRAM plants
* Enhanced user training programs
* Continuous model and system improvements

This case study demonstrates a well-thought-out approach to implementing LLMs in a production manufacturing environment. The focus on practical safeguards, user adoption, and scalable infrastructure provides valuable insights for similar implementations in industrial settings. The hybrid approach to search and retrieval, combining traditional methods with modern LLM capabilities, shows particular promise for handling technical documentation and specialized knowledge domains.
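To make the hybrid-search-with-guardrails pattern discussed above concrete, here is a minimal sketch that fuses a vector-similarity ranking with a keyword ranking and refuses to answer when no document scores well enough. The case study does not say which fusion method OSRAM used; reciprocal rank fusion is shown here as one common choice, and the function names and threshold are illustrative assumptions.

```python
"""Sketch of hybrid retrieval: fuse vector and keyword rankings with
reciprocal rank fusion (RRF), plus an out-of-context guardrail.
The fusion method, names, and threshold are illustrative assumptions."""

def reciprocal_rank_fusion(rankings: list[list[str]], k: int = 60) -> list[tuple[str, float]]:
    """Combine several ranked lists of document IDs into one ranking.
    Each document scores sum(1 / (k + rank)) over the lists it appears in."""
    scores: dict[str, float] = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)


def answer_or_refuse(fused: list[tuple[str, float]], min_score: float = 0.02):
    """Guardrail: return the top document only if it clears a threshold;
    otherwise return None so the caller can state that the information
    is not available instead of letting the model guess."""
    if not fused or fused[0][1] < min_score:
        return None
    return fused[0][0]
```

Documents that appear high in both the semantic and the keyword ranking accumulate the largest fused score, which is what makes this kind of fusion useful for technical jargon that embeddings alone can miss.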
