Why self-service analytics? Enterprises are looking for efficient and effective use of technology and time, a more data-driven workforce, reliable data as a source for insights, and reduced dependency on centralized support functions such as IT and BI. All in all, it is more cost-effective when rolled out and used properly.

Self-service analytics is a powerful way for an organization to become more data-driven in its decision making. Analytics can be used throughout the organization at all levels: operational, tactical and strategic. Nevertheless, we should not underestimate the quality of the foundation that has to be put in place; the devil is in the detail, and the integration may not be as simple as it looks.

The mechanism is twofold. A self-service analytics solution in the hands of the business user can be very powerful when it is rolled out and used in a controlled way, but it can also be dangerous, for instance when quality assurance processes are missing from the governance framework. Both aspects should be at the top of the program's list of priorities, and all deliverables of the program should serve to bridge potential gaps.

In addition, a few prerequisites must be in place before we can even start addressing data, insights or analytics. The critical path of the program presents itself right at the start of planning. Within this article's context, the leadership of the organization must recognize and embrace the need to become more data-driven and place it at the heart of the organization.

Here are 7 tips to make self-service analytics successful:

1. What the user wants, or not?

“Will it make my life easier?”, “What’s in it for me?”, “Only the relevant data for me, please”, “If I have questions, I need a help line”, “I want access to my data and insights wherever I am”. These are the questions and demands users will raise. So, practice what you preach: if you promise to roll out a service for the end user, then a user-centric service it must be, without doubtful concessions. Much of this comes down to user centricity. Countless clicks and endless procedures mean too many hurdles before the user gets access, and that will surely hurt adoption of the service.

2. Serve the purpose with the right tooling

The most important aspect is to offer easy-to-use tooling. Of course, some users may not be experts in performing their own analysis. That is why you need to offer a varied package with complementary types of tooling, certainly not limited to a fixed number of tools. The tooling should be flexible and scalable enough to move along with the growth of the program: it must serve the need for basic visualizations and light analytics, but should also be able to handle complex analytics and deep dives, or anything in between. Consider people’s working context, their data readiness and their analytics capabilities, and ultimately advise which tooling is fit for purpose.
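The matching of users to tooling can be sketched in code. The following is a minimal, hypothetical illustration; the tier names and the 1–5 scoring scale are assumptions made for the sake of the example, not part of any real product catalog:

```python
# Hypothetical sketch: mapping a user's data readiness and analytics
# capability (self-assessed, 1-5) to a tooling tier. Tier names and
# thresholds are invented for illustration only.

def recommend_tool_tier(data_readiness: int, analytics_skill: int) -> str:
    """Map a user's scores (1-5) to a fit-for-purpose tooling tier."""
    if data_readiness >= 4 and analytics_skill >= 4:
        return "advanced"      # notebooks, statistical/ML tooling
    if data_readiness >= 3 or analytics_skill >= 3:
        return "intermediate"  # interactive dashboards, guided analysis
    return "basic"             # curated reports, simple visualizations

print(recommend_tool_tier(5, 4))  # advanced
print(recommend_tool_tier(2, 3))  # intermediate
print(recommend_tool_tier(1, 1))  # basic
```

In practice such a mapping would be richer (working context, department, data sensitivity), but even a coarse version makes the advice repeatable rather than ad hoc.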

3. Data maturity as a success factor

One desirable success factor is a decent level of data maturity, or at least a certain understanding of the concept of data. Users must understand what you can and cannot do with data and how they can benefit from using it in their daily operation. The risk, however, is that without a proper and controlled roll-out, the habit of using spreadsheets and sharing information informally will persist. The roll-out plan should orchestrate the deployment of the service in such a way that people are educated and attracted to use the service. As their enthusiasm grows, there is a greater chance that they will become ambassadors. Time-to-market is reduced substantially when you thoroughly think the following through:

  • Without formalized approval to use the data, you simply cannot start. Therefore, discussions should be steered towards data usage and ownership.
  • Build structure and make agreements, for example about data delivery, validation of the data, refreshing and updating of data, and what to do when data issues occur.
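These agreements can be captured as a simple "data contract" checklist, so a dataset is only onboarded once every term is covered. The sketch below is illustrative; the field names are assumptions, not a standard:

```python
# Hypothetical sketch: the agreements from the bullets above captured as
# a structured "data contract", so missing terms are caught before a
# dataset is onboarded. All field names are illustrative assumptions.

REQUIRED_TERMS = {
    "owner_approval",     # formalized approval to use the data
    "delivery_schedule",  # how and when the data is delivered
    "validation_rules",   # how the data is validated
    "refresh_frequency",  # how often the data is refreshed/updated
    "issue_contact",      # who to contact when data issues occur
}

def missing_terms(contract: dict) -> set:
    """Return the agreement terms not yet covered by the contract."""
    return REQUIRED_TERMS - set(contract)

draft = {
    "owner_approval": "signed by head of department",
    "delivery_schedule": "daily batch at 02:00",
    "refresh_frequency": "daily",
}
print(sorted(missing_terms(draft)))  # terms still to be agreed
```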

4. Accessing relevant data

Potential users should not be limited or scared off by complex processes to get hold of relevant data. After all, the expected time-to-market for their data-driven decision-making solution is relatively short. So, for example, leverage the existing centralized architecture and data warehouse as much as possible, as it already holds substantial amounts of relevant data. This data is already cleansed, deduplicated and restructured, and most likely adheres to business rules. Although data ownership can sometimes be hard to track and trace, it pays off in the longer term to spend some effort getting it formalized. This should be done in the form of a written approval, provided by a person with a clear mandate, confirming that the data may be used. Typically this is a member of senior management, the head of the department, or their delegates.

In parallel with the program, work on a mandate for data processing: a centralized agreement signed by all data owners, approving access to load all corporate data (another challenge in itself). By using existing, or installing new, seamless and solid background processes and gate-checks, you cover the two most important aspects of the governance framework: commitment to data ownership and a defined access process. Obtaining data should mean ordering a relevant dataset with the least possible effort from the user; think of a process built around an online data catalog with an e-commerce-style ordering experience. With the correct allocation of rights, based on mapping the tool to the user’s maturity, you adhere to data security and privacy regulations.
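As a minimal sketch of such a gate-check, assume a catalog of approved datasets, a user-to-maturity mapping and a tool-to-minimum-tier mapping (all names hypothetical): an order only passes when a formal owner approval is on file and the requested tool fits the user's tier. A real implementation would sit on top of a data catalog product and an identity and access management system.

```python
# Hypothetical sketch of the catalog-and-gate-check flow described above.
# All datasets, users, tools and tiers are invented for illustration.

APPROVED_DATASETS = {"sales_eu": "head of sales"}  # dataset -> approving owner
USER_TIER = {"alice": "advanced", "bob": "basic"}  # user -> maturity tier
TOOL_MIN_TIER = {"notebook": "advanced", "dashboard": "basic"}

TIER_RANK = {"basic": 0, "intermediate": 1, "advanced": 2}

def order_dataset(user: str, dataset: str, tool: str) -> bool:
    """Gate-check: owner approval on file, and tool fits the user's tier."""
    if dataset not in APPROVED_DATASETS:
        return False  # no formal owner approval -> cannot start
    return TIER_RANK[USER_TIER[user]] >= TIER_RANK[TOOL_MIN_TIER[tool]]

print(order_dataset("alice", "sales_eu", "notebook"))  # True
print(order_dataset("bob", "sales_eu", "notebook"))    # False: tier too low
print(order_dataset("alice", "marketing_us", "notebook"))  # False: no approval
```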

5. Just start!

One of the biggest challenges is that you need to prove successful at a very early stage to build momentum for success at later stages. Change in general invites push-back rather than support, which makes adoption extremely difficult; the risk is not getting started with the program at all, mainly because of unawareness of the capabilities the program has to offer the organization. The ultimate goal is converting ‘unaware’ and ‘unknown’ into ‘aware’ and ‘known’. The best way to achieve this is to demonstrate the benefits. Project sponsorship can facilitate this, and a launching customer may be the solution to many start-up troubles. In most situations this is a team (or department) that wants to innovate and trial something new. Try to nominate ambassadors amongst these people. There’s nothing better than ‘one of them’ openly confirming the success and added value.

6. Knowledge is power … and leads to success

Data maturity and savviness have to come from somewhere, be it learning by doing or external training. As ‘controlled’ self-service analytics works best for people with at least some understanding of data, this knowledge must be provided, period. The program should use all ways and means to make people more knowledgeable, and actively communicate this. That means training on all relevant topics around self-service analytics must be offered virtually, physically one-to-one, or in a classroom setting.

The preferred format to roll out a solution is a pilot or proof of concept, which is by nature use-case driven. Why? Simply because of the required budget and acceptable lead time; remember, you are looking for a rapid success story. With success comes awareness and exposure, which is exactly what you need. Success breeds success, and everyone wants to experience it. Whether the result is small or big, the message should be broadcast via all possible communication channels (your marketing).

7. Short and long-term engagement

Start small and in a very controlled way, with high stakeholder involvement, then build on the success and scale up step by step with a supervised roll-out plan under your arm. Identifying stakeholder groups by mapping out data maturity and savviness helps in establishing the parts of the organization where to start equipping users with self-service analytics. This approach brings focus, and along the way the engagement starts.

Engagement is achieved both physically and virtually. The key is to build a community with (potential) users as soon and as widely as possible. One of the success drivers is a digital engagement platform. Such a platform allows you to interact with users relatively easily on almost all aspects of the service: knowledge management, support in the training cycle, ordering data, interaction with colleagues, and sharing experiences and best practices. Attracting people to the platform (traffic generation) is mainly done via regular e-mail broadcasts and blogs, complemented by word of mouth, expert presentations and physical outings such as a poster in the restaurant; it is certainly not limited to those communication vehicles. The roll-out plan will direct you on how to advance to a wider group within the organization.

Conclusion

Once a proper basis has been put in place, preferably in all regions where the company has an active and significant presence, the dependency on central IT and BI functions can be reduced. Now is the time to maximize pilots and use cases, leverage success and expertise, and grow a mature service within the program, ideally before it becomes an operational service.

Self-service therefore means exactly that: being able to perform activities yourself. As a result, the burden on IT and BI resources will reduce over time as the organization, with its stakeholders and users, grows into a more data-driven organization. With a seamless, controlled handover from program to operations, the right expertise also remains in the right places (yes, plural) within the organization. Users expect support in all areas in which they have been trained, especially in the beginning. Let’s establish a fair, level playing field in which both user and organization enjoy the benefits.

  • The intended objective: the program, and later the service, should realize a predictable workload, illustrating the allocation of people and the usage of tooling and data, based on a clearly defined and managed funnel of projects.
  • What to avoid at all costs: wild growth of initiatives without a proper roadmap or roll-out plan, and failing to use or leverage proven capabilities and expertise.