How can an ontology-driven approach enable an effective Big Data implementation?

After the hype around the three Vs (volume, velocity, variety), companies are struggling to make big data actionable within their organizations.
Over the last year, many analysts have reported a common feeling from the field about the cultural changes needed to enable an effective big data implementation (e.g. Forrester analyst Brian Hopkins, "Reset On Big Data Or Miss The Big Change"; Richard Waters in the Financial Times, "Big data sparks cultural changes"; HBR, "You May Not Need Big Data After All").

Today's technology largely addresses the needs of the three Vs. The myth of the data scientist has emerged as a tentative answer to organizations' analytics needs, an attempt to sift the gold out of the data lake.
Despite this, many organizations never leave the starting line of the big data lane, and in many cases delay decisions they should nevertheless take to support their business needs efficiently. A cultural shift across the entire organization, toward the ability to analyze data with a different mindset, is considered the way out of this impasse.

The question is: how can a company support this shift when, in many cases, it lacks a consolidated and common view of its information assets? (If you think you are not in this situation, ask yourself: are you absolutely sure everyone in your organization knows the concepts and data already managed by your core operating systems? And again: how many times have you analyzed the same transactional datasets just to understand the business logic before developing an analytic system?)

This scenario becomes even more complex when you consider the IT transformation needed at the infrastructure level to support the shift and the transition away from legacy layered environments such as classic DWH infrastructures.
Companies have in fact developed tactical analytic solutions over the years. This has led to an ecosystem of siloed information management systems built to support line-of-business (LOB) needs, in which multiple business requirements lie buried away.

In order to support a strategic change and implement a future big data evolution scenario, organizations need a solid base that makes this change common, permanent, and effective.

We have integrated our BI delivery model with an innovative approach based on the development of ontologies to support an incremental, formal, and shared representation of an organization. We call it Knowledge Intelligence and Discovery; it is a modular approach that integrates with and sustains the evolution of our offering model.

An ontology is a formal representation of a particular domain that includes both internal and external entities, their interplay, and their relationships. It can be developed incrementally and can support every phase of the IT solution delivery process, from requirements definition to the development of a specific solution. It can be considered both a methodology for sharing a common business view within the organization and a knowledge management tool.
We have several success stories applying this approach, using ontology development as: a common knowledge representation, an enterprise common data model definition, and an enabler for big data strategy definition and legacy infrastructure upgrades.
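To make the idea concrete, here is a minimal, self-contained sketch of an ontology as a set of subject-predicate-object statements (triples), which is how standards like RDF/OWL model a domain. The concept and relation names (Customer, Order, placed_by, Invoice) are purely illustrative assumptions, not taken from any real engagement; a production ontology would use a dedicated stack such as OWL with a triple store.

```python
# Illustrative sketch: an ontology as a growing set of
# (subject, predicate, object) triples. All names are hypothetical.
triples = set()

def add(s, p, o):
    triples.add((s, p, o))

# An initial team declares two domain concepts and how they relate
add("Customer", "is_a", "Class")
add("Order", "is_a", "Class")
add("placed_by", "is_a", "ObjectProperty")
add("placed_by", "domain", "Order")   # an Order is placed_by ...
add("placed_by", "range", "Customer") # ... a Customer

# Incremental growth: another department later adds its own concept,
# reusing the shared vocabulary instead of building a new silo
add("Invoice", "is_a", "Class")
add("bills", "domain", "Invoice")
add("bills", "range", "Order")

def classes():
    """All declared concepts, regardless of which team added them."""
    return sorted(s for (s, p, o) in triples if p == "is_a" and o == "Class")

print(classes())  # ['Customer', 'Invoice', 'Order']
```

Because every department writes into the same shared graph, the representation stays common and incremental: new concepts extend the existing vocabulary rather than duplicating it.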

To support this offering model, we have developed a methodological approach and a delivery framework that leverage the solutions of a panel of global partners, each best of breed in its sector.
The framework is highly adaptable and applicable even in contexts with a consolidated infrastructure. For this reason it allows IT departments to freely define their infrastructure evolution roadmap while developing big data initiatives. I have no doubt that this approach, based on the definition of a common knowledge layer, will help our customers win the challenge of the cultural shift needed to enable a different mindset toward big data and analytics.

In my experience, when a graph defining a domain is used as a base, business and IT people are well disposed to discuss, find, and define a common ground of knowledge. This is only the beginning of an incredible journey.
You will in fact be surprised to observe that the process of defining an ontology helps to create a collaborative atmosphere across the various business departments.

I personally agree with the principle that, to manage big data effectively, an organization must first shed light on its internal information assets. This is one of the reasons I am so excited about this approach: through methodological and technological tools, it builds awareness of the knowledge within the organization while laying the groundwork for a technical enhancement strategy.

And it is incremental! That means you can start with small steps while keeping the big picture in mind.

There are more details to share, and that's why I invite you to stay tuned to this blog.
