Client story

TE Connectivity boosts product development with a knowledge hub

Client: TE Connectivity
Region: USA
Industry: High-tech

Gen AI-powered research, developed in partnership with Capgemini and AWS, gives engineers access to previously siloed data


Client challenge: Product development teams needed to sift through countless documents scattered across dozens of incompatible systems to conduct background research.
Solution: Capgemini and AWS worked with TE Connectivity to create a Gen AI-powered platform that consolidates all internal research within an intuitive UI.
Benefits:

  • Productivity increased five to ten times for product development
  • 2.5 million documents ingested in three months
  • Access granted to 8,000 engineers at launch

TE Connectivity (TE), formerly known as Tyco Electronics and now a global leader in connectivity and sensing solutions, has distinguished itself from the competition with the kind of cutting-edge industrial technology that makes our modern world possible. A broad range of industries, including automotive, aerospace, energy, consumer electronics, and healthcare, rely on TE’s engineering expertise and innovations to transform their operations. Every year, the company produces more than 235 billion parts in 140 factories around the world.

Closely tracking recent breakthroughs in generative AI, TE was enthusiastic about how the technology could help search and summarize internal documents. These documents contain important proprietary information that the company needed to make readily available to staff.

During the request-for-proposal process, Capgemini and Amazon Web Services (AWS) proposed building a solution that would harmonize TE’s diverse datasets, establish a central repository, and build an intuitive chat function in a clean user interface (UI).

“Based on that flexibility, that background, that proven experience, we felt Capgemini and AWS were right for us. We certainly haven’t been disappointed. That was the right choice,” said Phil Gilchrist, Chief Transformation Officer at TE.

The development process: Building an LLM with proprietary data

With 75 million engineering documents spread across 66 different databases, it was difficult for TE’s research and development (R&D) teams to find the right information. The scattered nature of the information also meant subject matter experts often had to answer questions about specific projects when researchers could not locate relevant reference documents.

There was a tight deadline for producing a solution. In just over three months, the team ingested a wealth of marketing and operational data, along with 2.5 million engineering documents – many of which needed to be scrubbed – and surfaced it all through a modern UI. AWS supplied the cloud infrastructure and fully managed services, including Amazon Bedrock, which enables the integration of high-performing Gen AI foundation models, and Amazon OpenSearch Service, which makes it easy to deploy and operate search, analytics, and visualization capabilities.

Meanwhile, Capgemini used retrieval-augmented generation (RAG), an architectural approach that grounds large language model (LLM) responses in documents retrieved from enterprise data sources, to integrate these services into an enterprise-scale solution.
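To make the RAG pattern described above concrete, the sketch below shows a minimal retrieval-plus-generation flow on AWS: a question is matched against an Amazon OpenSearch Service index of ingested documents, and the top excerpts are passed as context to a Claude 3.5 model on Amazon Bedrock. This is an illustrative example only, not TE Connectivity's implementation; the endpoint, index, and field names are hypothetical.

```python
# Illustrative RAG sketch: retrieve document excerpts from OpenSearch,
# then ask a Claude 3.5 model on Amazon Bedrock to answer from that context.
# All endpoint, index, and field names are hypothetical.
import json

import boto3
from opensearchpy import OpenSearch

# Hypothetical OpenSearch domain holding the ingested engineering documents.
search = OpenSearch(
    hosts=[{"host": "search-example-domain.us-east-1.es.amazonaws.com", "port": 443}],
    use_ssl=True,
)
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")


def answer(question: str, k: int = 5) -> str:
    # 1. Retrieve the k most relevant document excerpts for the question.
    hits = search.search(
        index="engineering-docs",  # hypothetical index name
        body={"size": k, "query": {"match": {"content": question}}},
    )["hits"]["hits"]
    context = "\n\n".join(hit["_source"]["content"] for hit in hits)

    # 2. Ask the model to answer using only the retrieved context.
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    response = bedrock.invoke_model(
        modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",
        body=json.dumps({
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 512,
            "messages": [{"role": "user", "content": prompt}],
        }),
    )
    return json.loads(response["body"].read())["content"][0]["text"]
```

In a production setting, the retrieval step would typically use vector (k-NN) search over document embeddings rather than plain keyword matching, and would enforce the document-level access controls mentioned later in this story.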

“What they were able to do was launch a safe, secure solution into our security framework and our document structure almost immediately that’s very scalable, very secure, and something that so far has had a high quality of operation,” Gilchrist said.

A cross-company team continues to improve the tool on an almost daily basis with the ingestion of additional content (including all 75 million engineering documents by April 2025) and tweaks based on user feedback.

“Honestly, the team worked flawlessly together to stay on it,” Gilchrist said. “Everyone really wanted to make it work and we did make it work. So, I would say based on that common objective and a tight timeline, we had no choice but to work very closely together, and that was a fantastic experience.”

The transformative solution: TELme

The result was TELme, a conversational platform powered by Gen AI that collects and organizes the company’s diverse pool of internal knowledge about various industries and products in a single place.

TELme is TE’s Gen AI implementation built on Claude 3.5, the AI assistant created by Anthropic, and grounded in the company’s proprietary data through retrieval-augmented generation. It is all organized under a single application programming interface (API).

“Finding the right document was like finding a needle not just in one haystack but in 66 haystacks,” Gilchrist said. “TELme allows us to remove the haystacks and just find the needles.”

TELme establishes continuity and allows the company to hand down knowledge from one generation to the next. “We believe TELme will come to represent the sum of intellectual knowledge of the company in one form or another. But not only that: it’s a knowledge base that can be put into action. What that LLM will enable them to do is find the right piece of information right up front within seconds, rather than within a morning of trawling through extraneous documents.”

TELme goes beyond knowledge management. It provides an enterprise-wide research environment that fosters collaboration and communication.
