Certain aspects of business intelligence are drowning in their own bureaucracy. Is it possible today to put in a solution within a month that just gives you the real answers you need for your business? Or does an enterprise have to fend off solution providers’ efforts to mould its business around their software, rather than the other way around? What’s happened to the platform versus best-of-breed argument? Is the complexity too much for the all-in-one stack? For Craig Stephens, principal solution manager for business intelligence at SAS Institute, most of the complexity in BI projects arises around integration.
“People expect to be able to access information on more devices. I think we’ve been party to our own downfall in some of these cases where essentially we’ve promised we can deliver across all platforms but have run into complexity working with other vendors that have grown through acquisition. So we’ve had to deal with their integration components on top of ours, as well as the customer’s specific requirements. There’s a lot of mish-mash of technologies because people haven’t grown much organically. There are two camps to BI: one is to get real intelligence for the business and the other for regulatory and compliance projects. For a compliance-type project, it’s a long cycle with box-ticking and change control. The more agile guys are using BI to get information out and so can afford to be far more flexible.”
Craig Stephens, SAS Institute
Mark Bannerman, country manager for MicroStrategy SA, says the integration problems have historical roots.
“If you think about where we’ve come from, BI was departmental and with a lot of segmentation. In many ways that made things a lot simpler because you weren’t deploying across the entire organisation. Then we had a drive towards standard information across all of the business and that introduced the complexity of having to tie the various areas together, or at least having one view of it. The supposed answer to this was going to be the stack: a single vendor who is fully integrated across all solutions, and I think over the last two or three years it’s proved to be a panacea that hasn’t materialised. A lot of people have standardised but haven’t really enjoyed the experience as much as they’d have liked. Particularly, innovation has slowed down a bit.”
The other major problem is both the quality of data and the increasing pace of its creation. Has data quality improved? Gerald Naidoo, group CEO of Logikal Consulting, says it has improved somewhat, but with a caveat.
“Ninety percent of the data in the world has been created in the last two years. From an integration point of view, complexity comes in with an organisation’s need for mobility and two other areas: technological complexity and business process complexity. The quality of data in most organisations is about 60 percent. The extract transform load process (ETL) is where the wheels come off the Porsche. You can have the best BI tools in the world but if you have poor data, then you’re wasting your time.”
SAS’ Stephens doesn’t think there’s been too much change in the quality of data in the last five years.
“Although the data that’s being created is moving far away from structured to unstructured, companies still haven’t fixed their structured data,” he says. “I think one of the things BI vendors are trying to push is a pseudo-ETL capability in their tools to address that exact problem so you can transform and clean some of it. But once again the fundamental issue is one of data integration as opposed to trying to get data out there.”
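The pseudo-ETL cleansing Stephens describes is, at its core, a pass over raw records to standardise and de-duplicate them before the BI layer sees them. A minimal sketch in plain Python, with entirely hypothetical field names and records:

```python
# Minimal sketch of a pseudo-ETL cleansing pass: trim, normalise and
# de-duplicate customer records before they reach the BI layer.
# All field names and records here are hypothetical illustrations.

def clean_records(raw_records):
    """Trim whitespace, normalise case and drop duplicate or blank customer IDs."""
    seen_ids = set()
    cleaned = []
    for rec in raw_records:
        cust_id = rec.get("id", "").strip()
        if not cust_id or cust_id in seen_ids:
            continue  # skip unusable blanks and duplicates
        seen_ids.add(cust_id)
        cleaned.append({
            "id": cust_id,
            "name": rec.get("name", "").strip().title(),
            "email": rec.get("email", "").strip().lower(),
        })
    return cleaned

raw = [
    {"id": " 001 ", "name": "alice SMITH ", "email": "Alice@EXAMPLE.com"},
    {"id": "001", "name": "Alice Smith", "email": "alice@example.com"},  # duplicate
    {"id": "", "name": "no id", "email": "x@example.com"},               # unusable
]
print(clean_records(raw))
# → [{'id': '001', 'name': 'Alice Smith', 'email': 'alice@example.com'}]
```

Real ETL tooling adds schema validation, lookups and audit trails on top, but the fundamental job is the same: dirty data in, consistent data out.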
And the data problem isn’t going away, notes Bannerman.
“When you take the combination of smartphones, iPads, the cloud and the internet, data is growing exponentially. Complexity is also an aspect of the sheer volume of it. It becomes a more difficult task each year. A lot of the vendors, ourselves included, are talking about Big Data and the ability to sort through it.”
Social networking is another generator of Big Data, says Naidoo, and that tends to make a hard problem even harder.
“We’re getting interesting queries from customers about managing data from social networking. That is a huge issue. The real issue is that when you look across products, there is no one BI product that can address every single BI requirement. Some products are good at sales, others are good at data marts and extraction. So there’s a leader in the mart space, a leader in extraction and so on. So how does a business standardise on tools? You don’t have one company that dominates everything. Invariably, it is very tricky for a customer to add an ETL, add a pseudo-ETL, do the cleansing and at the same time select a BI tool.”
Bannerman has experience with trying to mine Facebook and it’s not pretty.
“Facebook is possibly the most illogical underlying data structure and it is certainly not a simple or straightforward environment in which to work. Now add to that the massive volumes of data and the massive volumes of interactions. Like many vendors we’re working on a gateway into Facebook and I can tell you that the complexity and the challenges are enormous.”
The right approach
So what is a company to do?
When enterprise software was a platform discussion a few years ago, and business intelligence was part of that platform, the decision was easy. But with the tide turning back to best-of-breed, is it still a good idea to choose a single vendor for everything? MicroStrategy’s Bannerman says it’s not a good idea for the vendor to try to be all things to all people in the first place.
“If they do that, you end up with an organisation that is typically stretched across too many different areas. Fairly shortly afterwards, they go from being leaders in individual areas to also-rans. In the Gartner quadrants, for example, during the years of the megavendors, the big boys raced right up into the top right but now they’re falling back down again. The independents are the ones who are flexible and agile enough to solve problems and react to trends.”
Stephens says the approach to choosing BI is common sense.
“If you carve it up properly, you will sit down and create a business functionality matrix. What do you need? A query and reporting tool, a slice-and-dice tool, something for Office integration, something for web content and so on. You will find a big end-to-end platform may not have the entire functionality you need. Then there are the small, quick problem-solving tasks where you need to do something with analytics. If you don’t have a platform in place, it has to be departmental. What does a retail bank do to deal with customer churn right now? It goes out to get the quick and agile tools that can be implemented in a month.”
“Some tool is better than no tool. Rather have point tools that have a short ROI than have no tool. Of course if you’re still paying for a BI tool after a few years, you’ve bought the wrong tool. Remember the most important quality of BI is to be able to forecast. That way you can properly run sales and resource analysis.”
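Stephens singles out forecasting as BI’s most important quality. Even the simplest forecaster makes the idea concrete: project the next period from recent history. A toy sketch with invented sales figures:

```python
# The simplest possible sales forecaster: a trailing moving average.
# Real BI forecasting is far more sophisticated; this only illustrates
# projecting forward from recent history. All figures are invented.

def moving_average_forecast(history, window=3):
    """Forecast the next period as the mean of the last `window` periods."""
    if len(history) < window:
        raise ValueError("not enough history for the chosen window")
    return sum(history[-window:]) / window

monthly_sales = [100.0, 110.0, 105.0, 120.0, 130.0]
print(moving_average_forecast(monthly_sales))  # → 118.33 (mean of last three months)
```

Commercial tools replace the moving average with seasonal and regression models, but the payoff Stephens describes is the same: a forward number you can run sales and resource analysis against.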
But how you put in a tool is key. Stephens has two contrasting examples.
“What works best for the big enterprise? I’ve seen in the last 18 months two of the top banks doing the same kind of modernisation programme of their overall analytical platforms in the credit space. One was extremely rigid and was run through enterprise project management. The other was far more flexible at a departmental level but even at the top level, they were able to collaborate between cards, home loans and vehicle finance. We estimated the flexible bank would take about six months and they did it in three. The other bank has, after 18 months, only just started getting the users to migrate. So they’ve spent 18 months in process. Within the regulated organisations, everything was segregated to the nth degree. So the Unix administrator could not talk to the security administrator: it had to be a change request that required seven approvals. To add a user to the system could take four days in case one of the guys was on leave. That process is going to kill you. You are going to lose business.”
Lack of analysis
Bannerman has seen lack of analysis burn some telcos.
“Some of the telcos are struggling because they have poor, low-value customers who burn all their bandwidth and network. They haven’t done any analysis on which ones they don’t mind churning and which ones they keep.”
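The value-versus-usage analysis Bannerman says the telcos skipped is, at heart, a segmentation: compare what each customer pays against what they consume, and flag the ones the business wouldn’t mind losing. A toy sketch, with invented figures and an invented threshold:

```python
# Sketch of the value-vs-usage segmentation Bannerman describes: flag
# customers whose network usage far outweighs the revenue they bring in.
# Revenue figures, usage figures and the threshold are all invented.

def churn_tolerance(customers, min_revenue_per_gb=2.0):
    """Split customers into those worth retaining and those the business
    'doesn't mind churning' (low revenue per GB of bandwidth consumed)."""
    keep, let_go = [], []
    for c in customers:
        revenue_per_gb = c["monthly_revenue"] / max(c["gb_used"], 0.01)
        (keep if revenue_per_gb >= min_revenue_per_gb else let_go).append(c["name"])
    return keep, let_go

customers = [
    {"name": "A", "monthly_revenue": 500.0, "gb_used": 40.0},   # 12.5 per GB
    {"name": "B", "monthly_revenue": 80.0,  "gb_used": 120.0},  # 0.67 per GB
    {"name": "C", "monthly_revenue": 150.0, "gb_used": 60.0},   # 2.5 per GB
]
keep, let_go = churn_tolerance(customers)
print(keep, let_go)  # → ['A', 'C'] ['B']
```

A real telco would fold in contract terms, acquisition cost and lifetime value, but even this crude cut answers the question Bannerman poses: which churners hurt, and which ones don’t.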
What’s next? Naidoo says in-memory technologies as customers demand more speed.
“For us as implementation partners of BI vendors, the biggest complaint is one of speed. Some tools have better interfaces than others but when you finally figure out how to get a report, it then takes four hours to get it out. That’s why in-memory is very exciting. If BI is to be a catalyst for real forecasting and business decision-making, it needs to be immediate. If I am a CEO and can pull up live numbers on my iPad, then BI is a real swinger. I will make a decision to do new investments.”
Stephens says this is exactly the reverse of a decade ago, when BI was let loose only on copies of offline data.
“Ten years ago, when we evaluated BI tools, one of the criteria was that it had to be offline. Now it’s exactly the opposite. We all want the latest numbers now. And it’s the mobile aspect that is pushing the boundaries of real-time processing, hence the move to in-memory. CEOs want highly visual, condensed numbers to view right now on a mobile device, and they want to be able to drill down further. They’re not going to be happy lying in bed knowing their sales numbers are down until they know why.”