Good Data
The stories, advice, and ideas from GoodData's business intelligence thought leaders.

Embedded Analytics and AI: The Future of BI

Lately, I’ve been thinking a lot about how I define Business Intelligence and where I see the industry heading. More and more, this connects back to embedded AI, machine learning, predictive analytics, data enrichment, and other AI methods. But before we can talk about where BI is going, we all need to be on the same page about what it is.

At its core, BI is about using analytics to drive profitable activity and enabling businesses to turn their data into actions and outcomes. The popular thinking is that self-service BI tools are easy to use and good for the business, and business users like the idea of having their own data analysis tools. But what organizations are failing to realize is that they would be far more productive and profitable if they used AI and machine learning to automate the mundane decisions that most people currently use BI tools to make, freeing their employees to focus on more strategic problems.

I don’t believe that the concept of self-service analytics is a scalable one. Employees should be spending their time on their core job functions instead of slicing and dicing data. Embedding analytics at the point of work and automating mundane decisions with machine learning and AI enables business users to take immediate actions to improve business outcomes. Employees can focus on their day-to-day responsibilities, while machine learning automates the tasks that don’t require the expertise, experience, and context that only a human can provide.

I’ve been inspired by JPMorgan’s COIN program; it’s an outstanding example of how a business can leverage machine learning to automate decisions and, in the process, save millions of dollars a year. COIN is a learning machine that automates in seconds the mundane task of interpreting commercial-loan agreements, work that, until the project went online in June, consumed 360,000 hours each year from lawyers and loan officers. This perfectly illustrates the business value that machine learning can deliver.

Startups and the world’s leading technology companies alike are pouring investment into AI technologies and machine learning capabilities, and those that aren’t are behind the curve. Tractica predicts that the market for enterprise applications of AI will surpass $30 billion by 2025, with a focus on better, faster, and more accurate ways to analyze big data for a variety of purposes. Those investing in advanced AI and machine learning capabilities will lead the BI industry as it moves further toward automation.

Right now, the BI industry remains focused on self-service capabilities, but I think there should be less emphasis on self-service flexibility and more on automation and AI. At GoodData, we are focused on creating solutions that support embedding AI at scale to automate the basic business decisions that people mostly use BI tools for, which is why we will keep investing in predictive analytics, machine learning, data enrichment, and other AI methods.

GoodData wants to be part of the production environment, and we believe that the best way of doing that is deploying Smart Business Applications that harness the above technologies to make organizations more productive by allowing employees to spend less time analyzing data and more time focusing on daily tasks. Smart Business Applications involve the seamless integration of BI tools with day-to-day business applications and workflows. They work by delivering relevant, timely, and actionable insights within the business applications where work is already being done, so decision makers no longer have to stop what they’re doing and open another app to get the insights they need. By putting these intelligent insights right in front of them the moment they need them, GoodData is ushering in the next generation of BI.

From Descriptive to Prescriptive: How Machine Learning Is Fostering a New Era of Analytics

Imagine going to your doctor with a problem. He gives you a thorough checkup, looks at the results, and tells you, “You have a bad case of [fill in the blank].” Then he pats you on the shoulder and says, “Good luck with that.”

It sounds ridiculous, but that’s what many organizations have been getting from their analytics and Business Intelligence platforms: a diagnosis with no prescription. Today, thanks to advances in machine learning and artificial intelligence (AI), a new era of BI — one that incorporates predictive, prescriptive, and cognitive computing as well as automation — is now possible.

In a recent article in BetaNews, GoodData Vice President of Product Bill Creekbaum explores the new BI capabilities that machine learning and AI have enabled:

Data is being analyzed faster and more accurately with these advanced analytics frameworks, and decisions are being automated with machine learning to decrease human error and increase the organization’s bottom line profit. Machine learning can detect new patterns and opportunities that humans cannot.

 

When organizations implement these advanced frameworks, analytics are no longer a matter of “what is happening?” but of “what will happen next … and what can we do about it?” By embedding analytics with AI capabilities at the point of work, they can deliver insights at the time and place where they can drive better business actions. And they can automate simple, routine decisions, allowing business users to focus on solving more complex problems.

Despite the tremendous advantages that AI-powered analytics offers, Creekbaum notes that adoption has been slow. According to a recent survey, only 28 percent of respondents reported having experience with machine learning, while 42 percent said their organizations lack the skills needed to support it.

How long can organizations continue to avoid integrating AI into their BI frameworks? Not long, according to Creekbaum:

Legacy BI platforms can no longer keep up as the world becomes more digital and automated. In order to remain competitive, businesses must be looking toward BI solutions with AI and machine learning capabilities. AI is the future for BI, and those that aren’t investing will lag behind the competition.

 

For more of Creekbaum’s insights, read the article on the BetaNews website.

Artificial Intelligence is the Future of Business Intelligence

A major focus for Business Intelligence (BI), data, and analytics vendors has been delivering tools that are ‘self-service.’ These solutions are based on the premise that with just a few easy clicks, users of all skill levels (regardless of their ability with data and analytics) can quickly sift through volumes of data to make radically better business decisions. But there’s a big problem: as the flood of data that businesses generate as part of their normal operations continues to increase exponentially, the ability of current tools to provide clear insights is rapidly degrading. And expecting everyone in the organization to become data scientists and business experts, in addition to performing their core business responsibilities, is simply unsustainable.

"By 2018, most business users will have access to self-service analytical tools,” said Anne Moxie, a senior analyst at Nucleus Research. “But the fact remains that there’s too much data for the average business user to know where to start.”

The fundamental issue is that even with these ‘self-service’ tools, most business users lack the analytical skill to reliably identify the intricate patterns in data that could be important information, or nothing at all. And more importantly, they shouldn’t have to.

In a recent article in Data Center Knowledge, GoodData CEO Roman Stanek argues that radical advances in computing power, predictive analytics, machine learning and artificial intelligence have opened the door to a new generation of analytic tools. If implemented properly, these systems can finally make good on the promise of Big Data by instantaneously pulling actionable insights from complex data sets and automatically surfacing them as recommended business actions, where and when they are needed most.

This isn’t some far off science fiction future. Machines are already getting better at extracting insights from complex data than humans are, and these cognitive abilities will only improve. Rather than continuing to pour money into tools that require employees to spend their time manually analyzing data and making mundane decisions, business organizations should be investing in next-generation systems that automate the bulk of these processes and allow their talent to focus on the strategic problems that really move the needle.

To learn more about the intersection of artificial intelligence and business intelligence, check out Roman Stanek’s article in Data Center Knowledge.

Download the Nucleus Research GoodData Guidebook to learn more about the next generation of predictive analytics and Smart Business Applications.

When Bridging the IoT Skills Gap, Product Teams Lead the Way

Has the technology of the Internet of Things (IoT) evolved faster than the skill sets required to leverage it in achieving business objectives? Given the scarcity of job postings with “IoT” in the title, it would appear so.

In a recent article, Internet of Business columnist Adam Bridgwater posed this question:

“With a looming skills gap in front of us, how will we consolidate skills in the software engineering realm of developers, architects and device specialists to provide for our IoT-enabled future?”

For answers, he consulted executives from three software industry leaders, including GoodData CEO Roman Stanek.

Product Management Leads the Way

In his response to Bridgwater’s question, Stanek explained that the skills needed for IoT success fall squarely within the product management role, where one finds awareness of the vision, strategy and security needed to handle IoT data effectively. The new skills required to execute on IoT initiatives will fall into two categories:

  • “How to Design”: How IoT fits into the product
  • “How to Protect”: The security aspect of handling data from IoT devices

Broadening the Scope

While IoT strategy should be owned by product management, in close collaboration with the security team, multi-functional collaboration across teams is key for IoT success.

“IoT needs to be collaborative because traditional expertise needs to be extended,” Stanek said, in the same way that today’s auto industry requires knowledge that goes far beyond carburetors and cylinder blocks.

For more of Stanek’s insights, read the article on the Internet of Business website.

Want to learn more? Click here for a webinar on Profiting from Actionable Analytics.

From Data to Insights to Action: How Smart Business Apps Are Closing the BI Loop

Not so long ago, the user experience of any app was determined by the technology. We all used Outlook because at the time it was the only option available, and users had to either adapt to the interface Microsoft gave them or … well, there was no “or.” Until Gmail arrived and showed us all that there was a better way.

Today, the user is in charge. Features like intuitive design, personalization, and voice control have become standard in consumer-facing technology (think Siri, Alexa, Uber/Lyft). Users now expect similarly tailored experiences from their business applications. And with so many options available in every category, they aren’t “stuck” with apps that fail to deliver the experience they want.

As data professionals, the question we need to ask ourselves is this: “Are traditional BI apps user-centric enough to meet the demands of today’s decision makers? And if not, can they make the leap?”

BI Yesterday and Today

Before we dive into that question, let’s take a look at the evolution of business intelligence. The very first BI apps were developed when companies had limited amounts of stored data. Their basic function was to centralize information and deliver “dashboards” to serve as a readable interface between users and raw data.

Eventually, BI evolved into standalone apps that data analysts or BI team members used to perform ad hoc analytics and distribute the results to decision makers. These apps were more sophisticated in terms of the data they delivered, but were still not capable of driving action. In other words, they were still static dashboards, and executives could look at them all day long and still not understand which important changes were taking place … or which actions they needed to take.

Today, enterprises have data coming at them from all sides: big data, unstructured data, data from the “internet of things” (IoT), and data from countless third-party systems. For present-day decision makers, static templates with predetermined sets of questions are no longer enough. They need a contextually aware solution.

Enter the third phase of BI evolution: smart business applications. The ability to gather, store, and report on data is yesterday’s news; now it’s all about making the data work for the user.

Why Smart Business Apps?

Smart business applications represent the integration of BI tools with day-to-day business applications and workflows. They deliver relevant, timely, and actionable insights within the business application where the work is being done. Decision makers no longer have to stop what they’re doing and open another app to check the data. Now the data is right there in front of them the moment they need it.

Smart business apps release data from the exclusive realm of data scientists and BI teams, making it readily available throughout the organization to service the needs of both the ‘data elite’ and business users.

Let’s look at a real-world example.

Say an organization uses a fictional app called “ExpenseHero” for their travel and expense management. Here’s what the workflow within the app looks like:

  1. An employee submits an expense.
  2. An auditor performs simple rule-based QA tasks such as checking whether the amount on the receipt matches the line item in the report.
  3. The request is forwarded to a manager, who approves or rejects the expense.

Now, we all know that managers are busy — really busy. What are the odds that they have the time to thoroughly review every expense request that comes across their desks? Practically nil, and that’s where far too many companies lose far too much money. It’s not about employees committing fraud or acting irresponsibly; it’s about the lack of sufficient time and resources to enforce the guidelines that are in place.

Managers may well have access to dashboards telling them what their spend has been this quarter, how much of their budget is left, and who their top 10 “spenders” are. What the dashboard doesn’t tell them is which actions they need to take and when to take them. It’s a good “pulse check” to look at on a monthly or quarterly basis, but it has little to no use in making day-to-day decisions.

So, how can we make this workflow “smart” in a way that enforces good hygiene in expense reporting without adding to the manager’s workload? Here’s what that might look like:

  1. The manager opens the expense app and sees the requests that need to be approved or rejected.
  2. Instead of opening a separate app to see the data that can inform her decisions, she sees a small recommendation window as she hovers over each request. The window offers a recommendation to accept or reject the expense along with justification for the recommended action.
  3. The manager accepts or rejects the requests on her screen, confident that her decisions have been guided by actual data.

This little recommendation window may not seem impressive, but it represents a tremendous amount of activity going on behind the scenes. The app probably has to run a machine learning algorithm to create that recommendation, taking into account a wealth of data to create a robust predictive model. For example, if John Smith typically submits expense reports in the range of $4,200 to $5,600, but suddenly he submits a request for $7,200, the app can recommend either rejecting the expense or giving it a closer look.

To deliver a reliable recommendation, apps must often reach outside the company for relevant information such as census data and details on key events. I’ll give you an example from my own experience: I once had to travel to Minnesota on business, and my trip happened to coincide with the Ryder Cup golf tournament. If you know anything about the hotel industry, you know that room rates can go up quite a bit during major events like this one. Sure enough, my room expenses were well outside my normal range, and the finance team rejected them.

Smart business apps can also tap into your CRM data to further refine their recommendations. Getting back to John Smith and his $7,200 request, let’s say that John is your top sales rep and that the size of his close amounts more than justifies the expenses he incurs as part of the sales process. Once the system “learns” this about John Smith, it can automatically adjust the parameters for accepting or rejecting his requests.
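To make this concrete, here is a minimal sketch of the kind of check that recommendation window might run. It is purely illustrative: the function, the two-standard-deviation band, and the idea of widening the band for a proven top performer are my own assumptions, not GoodData’s actual model.

    from statistics import mean, stdev

    def recommend(history, amount, tolerance=2.0):
        """Suggest 'approve' or 'review' for a new expense request.

        history   -- past approved amounts for this employee
        amount    -- the new request
        tolerance -- how many standard deviations from the usual spend we accept;
                     a CRM signal (say, a top rep with outsized close amounts)
                     could widen this for specific employees
        """
        if len(history) < 2:
            return "review", "Not enough history to score this request."
        center, spread = mean(history), stdev(history)
        low, high = center - tolerance * spread, center + tolerance * spread
        verdict = "approve" if low <= amount <= high else "review"
        return verdict, f"${amount:,.0f} vs. usual range ${low:,.0f} to ${high:,.0f}"

    # John Smith typically expenses $4,200 to $5,600, so a $7,200 report is flagged...
    print(recommend([4200, 4800, 5100, 5600, 4900], 7200))
    # ...unless CRM data justifies a wider band for him.
    print(recommend([4200, 4800, 5100, 5600, 4900], 7200, tolerance=5.0))

A production model would obviously draw on far more signals (seasonality, event calendars, CRM records), but the principle is the same: learn each employee’s normal range and escalate only the exceptions.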

It’s an exciting time for business intelligence. With the arrival of smart business applications, we are closing the data loop by helping users turn insights into action. And by embedding those actionable insights directly into the application, we are:

  • Eliminating any bias or irrelevant details (“noise”)
  • Making complex machine learning models — the ones that were formerly reserved for analytical elites — available across the organization so that everyone can put data to work for them
  • Making incremental improvements to decision-making processes that can lead to substantial bottom-line benefits over time

Download the Nucleus Research GoodData Guidebook

Nucleus Research releases GoodData Guidebook

In today’s crowded market, the measure of success for embedded analytics is no longer limited to analyzing data to improve internal operational efficiencies. Now, in order for organizations to extract the maximum benefit from their data, they should look to monetize it.

So says Nucleus Research in their recently released GoodData Guidebook, which explores how different companies use GoodData’s analytics platform and services to monetize their data assets and extract latent value from their data. In fact, Nucleus found that embedding GoodData’s analytics into their offerings was a significant differentiator that played a critical role in closing 80 percent of partner sales.

For their report, Nucleus interviewed multiple companies that had sought out vendors for advanced embedded analytics solutions. The report identifies the key reasons these companies chose GoodData, including:

Getting Analytics Where You Need Them

Nucleus found that a major requirement for the new generation of analytics tools is the ability to embed insights at the point of work. Previously, analytics were treated as separate, standalone systems. This created adoption problems, as business users who aren’t primarily responsible for data analysis will be reluctant to add those tasks to their workload. The solution, according to Nucleus, is to embed this functionality into other core enterprise applications. By doing so, analytics vendors can offer in-context insights to users, who can easily incorporate that analysis into their daily workflows to make more informed choices and take smarter actions every day.

Harnessing the Advantages of the Cloud

Cloud-based options were a top priority for users and partners when evaluating analytics offerings. Previous research from Nucleus found that cloud-based solutions can offer 2.3X ROI compared to on-premise solutions. This is because cloud offerings reduce hardware and software costs, lower consulting costs, and are faster to implement and easier to upgrade.

Direct Revenue Generation and Reduced Costs

In their report, Nucleus found that companies who partnered with GoodData were able to generate net-new revenue opportunities with externally facing data applications. Offering these data products played a critical role in closing 80 percent of sales while shortening the sales cycle, boosting retention, and driving an average 25 percent increase in upsell revenue after the close.

On the other side of the coin, GoodData’s platform allowed these same companies to dramatically reduce costs on data management, analysis, and distribution as well as other areas.

The GoodData Difference

The analytics market continues to expand at a prodigious rate. But according to Nucleus, the focus on data monetization and a customer-first mentality set GoodData apart.

“Nucleus expects that the culmination of GoodData’s ability to monetize data assets for its partners, along with its ongoing commitment to customers, will allow the company to experience long-term growth and stand out in the field.”

For more insights on the benefits of cloud-based analytics solutions, you can download the full report from Nucleus Research here.

Nucleus Research: GoodData Guidebook

Partners reported that they were able to use the GoodData platform and services to monetize their data assets, which created a new source of revenue and allowed them to extract the latent value of their data.

Download Guide
How AI Will End Offshoring

We are living in the Rise of the Machines. No, Skynet and its hordes of Terminators haven’t materialized just yet, but there are many who fear that advances in automation and Artificial Intelligence are a serious threat to humanity in at least one respect: jobs. In America, more jobs are already lost to robots and automation than to China, Mexico, or any other country. But the global impact of these systems will be felt even more strongly. I believe that the proliferation of ‘virtual talent’ will have a profound effect on intellectual offshoring and business process outsourcing, one that will be especially pronounced for emerging countries.

Traditionally, outsourcing and offshoring have primarily been cost reduction measures. After all, why spend more on an expensive local worker sitting at a computer when the same tasks can be performed at a dramatically lower cost by an overseas worker sitting at the same computer, all while maintaining the same level of quality? In the pre-AI world, that thinking made perfect sense. But what if you could make the computer do that same job, without a human operator? The cost savings would be massive, and the business decision obvious.

Advances in AI technology are rapidly making this hypothetical a reality. Recent research from Gartner found that by 2018, 40% of outsourced services will leverage smart machines, “rendering the offshore model obsolete for competitive advantage.” And this market is only going to grow. The same report states that over $10 billion worth of these systems have already been purchased from the more than 2,500 companies providing this technology, while more than 35 major service providers are investing in the development of "virtual labor" options that leverage AI-driven service offerings.

All of this means that the intellectual offshoring we've seen since the 90s will no longer be needed or even viable, as there won’t be any business requirement for these services or economic incentive to move these tasks overseas. AI and advanced analytics allow for the automation of many tasks that are currently outsourced. That’s an extremely attractive option; automating tasks that are currently performed by hundreds of overseas employees will enable businesses to hire more expensive local talent who can focus on the difficult tasks and strategic decisions that make bigger business impacts.

AI not only softens the incentive of cheap foreign labor, it also negates the advantages of lower offshore operational costs. If you can locate the machines that are performing these tasks anywhere on earth for basically the same cost, why not keep them close to your home base of operations and save a bundle on travel, audit and compliance costs?

These shifts might take a few years as the technology develops, but they are coming, and they will fundamentally change the way the world does business. Intelligent machines are here, and companies that continue to rely solely on outdated offshoring models out of fear of the risks and challenges of being early adopters do so at their peril. Virtual labor technology can offer potential cost savings of 10 to 70 percent, so business leaders must begin planning now for how to adopt this technology and adapt their organizations to maximize its potential in order to survive and remain competitive.

Winning the Embedded Analytics Go-to-Market Race: GoodData at SaaStr Annual 2017

With demand for their products on the rise, SaaS leaders find themselves faced with an extraordinary opportunity … and an equally intense challenge.

The opportunity is a booming market. Forrester reports that fully ⅓ of enterprises complement their existing offerings with SaaS, with analytics and intelligence solutions taking center stage. Of course, with expanded opportunity comes the challenge of expanded competition — especially in a field with a low barrier to entry — and the need to differentiate in a crowded field that seems to grow on a daily basis.

The good news is that SaaS brands already hold a vital key to differentiation: your data. Embedding analytics in your SaaS solution can help you drive serious results quickly, as MediGain did: they achieved 1044% ROI on their analytics investment with a payback period of little more than a month. Others, like ServiceChannel, leveraged embedded analytics to increase engagement from 30% to an incredible 80%, while Demandbase helped some of their customers boost conversion by 200% by integrating GoodData’s analytics into their Performance Manager SaaS product.

And as with most competitive advantages, your embedded analytics product must get to market quickly, ahead of your competitors, if it’s going to succeed.

This speed-to-market is the theme behind GoodData’s participation in this year’s SaaStr Annual conference and trade show. Every year, SaaStr brings together the top professionals and executives from across the SaaS community to learn from the industry’s most prominent thought leaders. This year’s event, coming up on February 7-9 in San Francisco, is expected to draw 10,000 attendees for networking, learning and fun.

The GoodData team will be on hand at Booth 4 with live demos and expert advice on getting to market faster with your embedded analytics product. Schedule a time to meet with us here, and let’s talk through your biggest questions around executing a successful product launch.

SaaStr Annual attendees will also be treated to a presentation from GoodData partner Zendesk: co-founder and CEO Mikkel Svane will take the stage to share the story of his brand’s rise to become the dominant SaaS provider in the space. Be sure to check out Mikkel’s presentation, “Zendesk: From Day 0 to Today: The Lessons Learned” on Thursday (2/9) at 9:30 am.

We’ll see you at SaaStr Annual, and remember to book your meeting with our team while spaces are still available!

MediGain Brings Automation and Analytics to New Heights with GoodData

MediGain had an absolutely incredible 2016. They received a 2016 Technology ROI Award from Nucleus Research for their implementation with GoodData’s analytics platform, a deployment that ultimately generated a 1044 percent return on investment for the company. And according to a recent article in American Healthcare Leader, MediGain’s CIO Ian Maurer has now made it his mission to take the company to a whole new level as far as automation, analytics and security are concerned.

While many of Maurer’s initiatives focus on data security, a key area of innovation has been MediGain’s analytics capability. When he first joined MediGain, their Business Intelligence team was stuck creating month-end reports for each client with an old system that was very slow and required massive amounts of manual data work.

“It was a very manual process,” Maurer said. “It required extracting data from client systems and converting it to Excel format, and then going in and scrubbing that data, creating pivot tables and all this manual data manipulation to create a report to send out as early in the month as possible.”

As the company was experiencing rapid growth following a series of acquisitions, a quantum leap in analysis efficiency over their legacy on-premise analysis system was needed in order to automate these procedures and meet the increased demand.

“I decided on a platform as a service solution with GoodData, and using that tool I was able to create practice management vendor specific automated extracts,” said Maurer. “Once per day, we pull data from our client practice management systems and aggregate and normalize it all so it’s standardized from client to client, and we could look at it in an aggregated format and see all the clients rolled up, and see how we are doing as a company, and see their financial performance over time.”
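As a rough sketch of what “aggregate and normalize” can look like in code (the column names, the shared schema, and the pandas-based approach below are all invented for illustration; this is not MediGain’s data model or GoodData’s pipeline):

    import pandas as pd

    # Hypothetical mapping: each practice management vendor exports slightly
    # different field names, so map them onto one shared, canonical schema.
    CANONICAL_COLUMNS = {"chargeamt": "charges", "paidamt": "payments"}

    def normalize(extract: pd.DataFrame, client_id: str) -> pd.DataFrame:
        """Standardize one client's daily extract onto the shared schema."""
        df = extract.rename(columns=str.lower).rename(columns=CANONICAL_COLUMNS)
        df["client_id"] = client_id
        return df[["client_id", "charges", "payments"]]

    def daily_rollup(extracts: dict) -> pd.DataFrame:
        """Combine every client's normalized extract into one aggregated, comparable view."""
        combined = pd.concat(normalize(df, cid) for cid, df in extracts.items())
        return combined.groupby("client_id", as_index=False)[["charges", "payments"]].sum()

    # Example: two clients whose vendors name the same fields differently.
    extracts = {
        "clinic_a": pd.DataFrame({"ChargeAmt": [100, 250], "PaidAmt": [80, 200]}),
        "clinic_b": pd.DataFrame({"chargeamt": [400], "paidamt": [390]}),
    }
    print(daily_rollup(extracts))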

Partnering with GoodData revolutionized MediGain’s analytics offerings. Their new analytics platform eliminated the errors that had plagued the old system, while freeing up time and resources for their BI team to focus on interpreting the data and managing other projects rather than running reports all the time. In fact, Nucleus Research found that GoodData’s platform not only increased the efficiency and efficacy of MediGain’s BI staff, but also enabled them to offer new client-facing packaged analytics services for financial analysis. This implementation has enhanced MediGain’s value proposition, attracting new clients and boosting their revenue.

Learn more from Nucleus Research about how MediGain drove 1044% ROI with advanced analytics

The Three Secrets to Successfully Monetizing IoT Data

The global business landscape is being transformed by the Internet of Things. In the emerging Connected Economy, every business will have to become an IoT business, and enterprises that don’t adapt, innovate, and transform their models risk falling behind. However, the reality is that IoT monetization is going to be difficult, especially IoT data monetization. In fact, I believe that more than 80 percent of companies that begin IoT implementations will squander their transformational opportunities.

For any company that has customers, suppliers, employees or assets, IoT monetization can be transformational. But success in IoT monetization comes down to more than just technology; it requires careful strategic planning and shifting your entire corporate mindset to treating data as the valuable asset that it is. To learn more, I encourage you to check out the article I wrote for BetaNews, where I detail my top three strategies for ensuring that your IoT initiative is part of the 20 percent that actually succeed.

Oh, and one last thing: as 2016 draws to a close, I want to take this opportunity to wish you Happy Holidays and a fantastic 2017 from all of us here at GoodData!

- Roman Stanek

Click here to read Roman’s “Three Secrets to Successfully Monetizing IoT Data” in BetaNews

Want to learn more? Click here to download Gartner’s report Prepare to Monetize Data From the Internet of Things

Profiting From Actionable Analytics

On Thursday, December 15th, I had the pleasure of co-presenting with Martin Butler a webinar titled "Profiting From Actionable Analytics”, where he and I explored how enterprises can leverage their investments in data (whether it’s big, transactional, or IoT data) and drive profitability via actionable analytics. Martin looked at industry trends and thought leaders to uncover best practices, while I focused on how the GoodData platform leverages our Cognitive Computing, Distribution, Connected Insights and Analytics services to deliver Smart Business Applications that drive analytic-driven action, whether it's an automated, recommended, or in-context business decision.

I invite you to check out the recording of the webinar.

What Fighter Jets Can Teach Us About BI Modernization

Business is War. And just like today’s battlefields, the modern business landscape is more crowded, competitive, and lethal than ever. In both arenas, it’s no longer just survival of the fittest; it’s survival of the smartest. Success depends on using every ounce of information and intelligence at your disposal to make smarter decisions and take actions faster than your opponent. So what can a next generation fighter jet teach us about BI Modernization? As it turns out, a lot. Each relies on advanced, intelligent data management to deliver unprecedented levels of efficiency and effectiveness. And both have the potential to completely revolutionize the way things are done.

Military aviation is a lifelong passion of mine, so it’s been with great interest that I’ve followed the development of the Lockheed Martin F-35 Joint Strike Fighter. And while that program has seen its fair share of challenges, the F-35 packs some undeniably revolutionary technology that gives it critical advantages over every other fighter on earth. The F-35 isn’t revolutionary because it flies faster or maneuvers better than previous generations of fighters (side note: it doesn’t.) It’s revolutionary because it completely shifts the paradigm of how it does its job, to the point where it simply doesn’t have to compete against legacy platforms on their same terms. The F-35 achieves this next-generation performance through a concept called Sensor Fusion, where information from multiple sensors is combined with advanced cognitive processing to allow F-35 pilots to see a clear picture of the entire battlefield environment in real time. In short, the F-35’s technology gives its pilots the capability to detect a threat, orient their plane to the optimal tactical position, and then decide to engage or avoid that threat before their opponents even realize they’re there.

So what does this have to do with BI Modernization? In business, as in aerial combat, mission planning and intelligence is as vital to success as execution. The faster you can paint a picture of your operational environment, identify obstacles to success, and take action to overcome them to achieve your objectives, the better. The F-35 paints this picture with sensor inputs while BI tools rely on various data sources, but in both situations it’s not the data inputs themselves but how that information is processed that makes for new levels of performance.

Rather than having a complex cockpit crammed with disparate displays from each sensor (which places an immense workload on the pilot, who has to process, analyze, and prioritize this data), the F-35 integrates all this sensor data into one location and then uses its advanced processors to prioritize threats and fill in any missing gaps so the pilot is presented with a clear, actionable course of action. This is exactly the goal of BI Modernization. While there is a plethora of BI solutions that provide analytics and reporting, most are not designed to cope with new formats and higher volumes of data. Simply put, the amount and variety of data available to business users is outpacing their ability to distill it into actionable insights. What is needed is a “BI F-35”: a solution that replaces the overly complex BI “cockpit” of the past with a platform that leverages advances in cognitive processing to take data from myriad sources and distill it into clear insights that inform tangible actions.

Usability and Functionality Still Reign in Analytics: Nucleus Research

Once dominated by a handful of providers, the analytics market has become a crowded field in the last few years. But despite the diversity of solutions now available to them, for many users the task of evaluating their options often leads to more confusion than clarity.

However, as Nucleus Research reports in their 2016 Analytics Technology Value Matrix, customers still determine a provider’s value according to two simple criteria: usability and functionality. They want their solution to work well, and they want it to be easy to use.

Based on these two determinants, the Nucleus team created a “value matrix” classifying the market’s key providers into four categories:

  • Leaders: High usability, high functionality
  • Experts: Low usability, high functionality
  • Facilitators: High usability, low functionality
  • Core Providers: Low usability, low functionality

In evaluating each vendor, Nucleus looked at three critical components: embedded analytics capabilities, cloud deployments, and AI/machine learning.

GoodData as Leader

I’m pleased to report that in Nucleus’ evaluation, GoodData ranks among the Leaders (high usability, high functionality):

GoodData continues to be a leader in the analytics market, differentiating themselves by helping customers turn their data into a profit center. The company provides customers with highly usable analytics solutions, as well as a platform and services for organizations looking to monetize their data assets.

For a real-world case study, the Nucleus team reviewed MediGain’s deployment of the platform and confirmed that “GoodData played a key role throughout the entire go-to-market process in helping MediGain capitalize on its data.”

In conclusion, the team predicts that GoodData’s unique capabilities will fuel further growth in visibility and popularity:

GoodData can provide the analytics development and tools along with a Service Data Product team to build a complete GTM framework including market research, product design, pricing and packaging, and product marketing activities for those organizations that want to provide a data product quickly and may not have the internal resources available.

For more details about Nucleus Research’s analysis and criteria, I invite you to download the white paper here.

Mercatus: Driving Revenue Opportunities for Utility Companies

When a power company considers a major new investment, a huge amount of data must be evaluated … and Excel spreadsheets just aren’t up to the task. Mercatus offers a better way, providing a system that houses all this information and allows stakeholders to easily review the entire asset portfolio.

When Mercatus’ customers began requesting distributed analytics, Senior Director of Product Management Cathi Grossi and her team saw an opportunity. Creating a data product would enable them not only to deliver greater value and create upsell opportunities, but also to compile anonymous customer data for benchmarking. After briefly considering the “build or buy” decision, they decided to partner with GoodData. (Read the full case study here.)

“Building a distributed analytics solution is not the business we want to be in,” says Grossi. “We didn’t have the expertise needed, and there was no reason to reinvent the wheel.”

Thanks to Mercatus’ Pipeline Analytics product, customers can now see exactly where every project is in their pipeline, which helps them make more informed decisions, address issues, and become aware of potential disruptions. The analytics have become an integral part of the selling process, in addition to offering a unique upsell opportunity.

“Everyone wants it,” says Grossi. “We’re reinforcing the fact that when you buy Mercatus, you’re buying a business intelligence platform for the utilities industry.”

To learn more, read the Mercatus case study, “The Power of Project Insights.”

Actionable Analytics – In Search of the Holy Grail

So we load up our shiny new visual analytics platform and start slicing and dicing data, creating charts and graphs using the latest designer color schemes, and we might even add a dash of animation. And because we now have so many sources of data, the combinations of dimensions and measures are pretty well endless. As a result we start to see correlations and trends where none had been visible before. These ‘insights’ come thick and fast. Only there are two issues that spoil the party: are those ‘insights’ really insights, and what do we do with them?

To answer the first question: it has long been documented that human beings invariably add their biases when analyzing data. If we want to find a trend, sure enough we will find one. Whether it is really a trend depends on a deep understanding of the business, and whether that rising line is in any way justified. If not, then it is probably nothing more than noise dressed up to look pretty. So we have to be very careful when digging around in data. We should understand that correlation is not causation, and that many of the ‘insights’ we gain might be nothing more than randomness made to look respectable.

The second question – what do we do with the insights – is complicated. Let’s examine what might need to happen to make an insight actionable:

  • Business managers must be convinced that the insight is real and can be trusted. It becomes necessary to create a story, presenting all the evidence, with explanations of why the various charts and graphs are the way they are. This might seem like it should be a straightforward process, but let’s remember that change usually involves treading on someone’s toes. So there will be resistance from those who will need to implement change, and detailed scrutiny of what is being presented, with ruthless exploitation of any holes in the evidence being presented. Think carefully before you poke a stick in the hornet’s nest.
  • If there is broad agreement that the analysis is correct, the next step is to decide how business processes need to change. The change might be trivial, or it might go to the heart of several business processes. If it is the latter then it may take months to redesign processes and get agreement from all involved.
  • Once implemented, the changes need to be monitored, so new reporting and analysis procedures need to be put in place. Fine tuning might be required, or heaven forbid, we might find that matters have not improved in the way we expected. New analysis, with new ‘insights’, might suggest additional changes. It might be best to keep quiet about that one.

It should be clear from this that analysis is really just a small part of the insight-to-action journey. While suppliers of analytics tools flatter business users that their insights are important, the reality is somewhat different. Most insights will not be insights at all, but random noise making itself look respectable. Those insights that have been interrogated and found to be true mean nothing unless business processes can be changed, and this is where the real hard work is to be found.

Before we obsess over charts and graphs we should ensure that the results of analysis can be implemented in a controlled, documented, measurable manner. To this end standards such as Decision Model and Notation are being introduced, as a way to integrate decision logic into business processes. After all, making the results of analysis actionable generally means that people make different operational decisions.
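As a loose illustration of that idea (real DMN is an OMG standard with its own notation and tooling; the sketch below is only my own simplification in code), decision logic written down this explicitly can be reviewed, versioned, and tested like any other business artifact:

    # Not real DMN, just the underlying idea in plain Python: the decision logic
    # lives in an explicit, reviewable table rather than in a dashboard reader's head.
    # The rule names and thresholds are invented for illustration.
    DISCOUNT_RULES = [
        # (condition on the order, decision)
        (lambda o: o["total"] >= 10_000 and o["repeat_customer"], "15% discount"),
        (lambda o: o["total"] >= 10_000, "10% discount"),
        (lambda o: o["repeat_customer"], "5% discount"),
    ]

    def decide_discount(order: dict) -> str:
        """First-match rule evaluation, mirroring a decision-table style of logic."""
        for condition, decision in DISCOUNT_RULES:
            if condition(order):
                return decision
        return "no discount"

    print(decide_discount({"total": 12_500, "repeat_customer": False}))  # 10% discount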

Right now we are fascinated with the shiny new things – just because they are new and visually attractive. The real work consists of having people who know when randomness is trying to pull its tricks, so that analysis actually means something, and having the processes in place to transform significant insights into action. Anything else is just a waste of time.

Within half a decade we will have AI-powered BI, and the task of finding meaningful insights in our data will become easier. We will also have the methods and infrastructure to quickly move from meaningful analysis to modified business processes. Until then we need to be cautious, and expect that actionable analytics will become more of a reality as the whole process becomes automated.

"This article was originally published by Butler Analytics"

Data-Driven BI Market Expected to Reach $26.5 Billion by 2021

According to a recent article in EconoTimes, a new report by Zion Research shows that the data-driven global business intelligence (BI) market accounted for USD 16.33 billion in 2015 — and that it could reach USD 26.50 billion by the year 2021.

“Business intelligence helps companies make better decisions,” the article states. “[S]oftware will improve the visibility of processes and make it possible to identify any areas that need development.”

The Zion report goes on to identify the largest market for BI growth as North America, which accounted for 86% of the global market share in 2015, and lists GoodData among the major industry participants in this worldwide trend.

Here at GoodData, we were very interested by this report for obvious reasons, but we were also thrilled to see that several of our key strategies are being validated by market trends. We believe that true Business Intelligence is about more than dashboards; it’s about making actionable, data-driven recommendations that inform actions, and making sure those recommendations are embedded seamlessly into workflows to maximize their effectiveness.

The report supports this strategy, noting that BI is no longer tied to desktop computers: mobile business intelligence (MBI) now accounts for more than 20 percent of the global business intelligence market. Not only do mobile BI apps facilitate the viewing of data for an increasingly mobile workforce, but they also allow data captured via mobile devices to be incorporated instantly so that reports can be updated in real time.

Zion’s research also found that while on-premise analytics still account for 86% of the market share, this number “is projected to witness a decline in its shares by the end of the forecast period.” As a cloud-based analytics company, we have long expounded on the benefits of moving BI and data warehousing environments to the cloud. Among other benefits, doing so speeds deployments, avoids capital expenditures on hardware infrastructure, simplifies software upgrades, and minimizes the need for IT involvement. We’re very happy to see cloud solutions being adopted more quickly in the BI marketplace.

For more details, you can read the full article here.

To learn more about how to define and measure your data’s usefulness and monetary value, download the Eckerson Group report on Data Monetization Networks.

6 Tips to Hosting a Successful Customer Advisory Board

As we enter the holiday season and people start talking about their favorite time of the year, I also like to reflect on my favorite work time of the year: GoodData’s biannual Customer Advisory Board. It’s so energizing to get a small group of thought leaders and visionaries into a room for 2 days to show off what they’ve built, talk about their plans for the future, and discuss how, by partnering together, we can bring those ideas to fruition.

I’ve overseen Customer Advisory Boards for the past 15 years at various companies I’ve worked for, and they’ve always been a highlight. This year’s CAB was no different, so without further ado, here are some of the key lessons I’ve learned.

  1. The product team owns the content. Everyone, and I mean everyone, wants to attend a CAB and have a speaking slot because it’s not often that multiple customers are in one room for 2 days at the same time. And really important information is shared. AND it’s also a chance to meet customers face to face. AND it’s a LOT of fun. But other groups have a regular cadence of discussion with customers, and this is the product team’s opportunity to home in on the product roadmap. Customers want to share their visions and goals and influence how the product can help them meet those goals. I can’t tell you how many times we’ve shared a roadmap only to learn that the one ‘big’ thing we thought customers wanted was a lower priority than we thought. Prioritization exercises are great learning experiences.
  2. Have specific goals and an agenda that will help you meet them. It’s easy to fall into the trap of wanting to get every question answered and show off what you’re building and planning. In order to have an effective CAB you need to focus. After the CAB you’ll have plenty of opportunities to continue the conversations now that a relationship has been established.
  3. Keep it small. The urge to invite every ‘important’ or ‘big’ customer is one that needs to be curtailed. Every customer is important, but that doesn’t mean that all customers are comfortable sharing in groups, or will have interest in the particular product features you will be discussing. A smaller group allows for more in-depth discussion and participation. There is nothing worse than having members who don’t participate. You don’t want to waste anyone’s time.
  4. Invite customers based on your goals - not based on name, size, or previous participation. This goes back to number 3. Every CAB has a different focus or goal, and that might mean that some people who have participated in the past won’t for this particular meeting. That doesn’t mean they aren’t important anymore; it means that their use case isn’t one you will be discussing this time and it wouldn’t be a good use of their time. It’s hard not to invite the same faces once you’ve built a rapport, but hopefully you are still engaging them for feedback.
  5. Allow enough time for networking and sharing of ideas. It always surprises me how much time our customers want to spend seeing what others have built and sharing ideas among each other.
  6. Don’t talk - LISTEN. This is the HARDEST lesson of all but the most important. You are a facilitator of conversation and the sharing of ideas. Customers do not want to be talked at - they could listen to a webinar if that’s what they wanted. This is their opportunity to share their ideas with you. It’s important that you hear them and not feel the need to always respond. I’d also encourage you to consider inviting customers who may not currently provide a glowing reference. Some of the best ideas come from brainstorming together with a less-than-happy customer. And what better way to get them re-invigorated about the relationship than to have worked together on solutions that advance their vision?

Frankly, I could go on and on about the benefits and joys of hosting Customer Advisory Boards. So I’ll end with a great big thank you to our current and past CAB members. Without them, we wouldn’t have built the amazing company that we have.

How Fourth Drove 117% ROI with Data Products

For hospitality managers, cost control is a constant juggling act, involving inventory management, personnel, pricing, and a host of other factors. They have the data they need to make informed decisions; they just don’t have the time or resources to analyze and distribute it effectively, and most analytics packages are beyond their budgets.

Enter Fourth, the world’s leading provider of cloud-based cost control solutions to the hospitality industry. The company saw an opportunity to offer an analytics platform alongside its existing purchase-to-pay, inventory management, and other solutions.

“Our mission was to create a solution that everyone could afford and could take advantage of,” says Mike Shipley, who coordinated Fourth’s project with GoodData. “We wanted to empower our customers to run their businesses better — to make the information easy to access, and to offer it in a graphic, interactive interface that they can use everywhere.” (Read the full case study here)

In 2013, Fourth and GoodData partnered to design and launch a data product that was unlike anything else in the hospitality marketplace. The new platform allows customers not only to improve business performance by enabling decisions based on actual data, but also to automate distribution of the right data to the right stakeholders in an easy-to-read format.

One Fourth Analytics customer, GLH Hotels, was able to identify an opportunity to increase its margins simply by changing its staffing profile. “Now we have a greater percentage of employees on part time and flexible contracts, with the obvious effects on the bottom line,” says GLH’s HR Programme Manager Andrew Elvin.

Since Fourth Analytics went live, the product has delivered — and continues to deliver — a host of benefits to the organization, including

  • New revenue opportunities
  • Added value to current customers
  • Faster deployments
  • Thought leadership
  • ROI of 117 percent

To learn more, read How Fourth Added Analytics to Its Menu…and Realized a 117% ROI.

Turning Data Into Revenue-Generating Products: Four Steps to Success

Over the next few years, the market for data-driven products and services is expected to grow by 26.4 percent — six times the rate of the tech sector as a whole — according to IDC Research.

Driving this growth is the realization that data products can create not only a new source of revenue, but also a unique competitive advantage. However, many companies become stuck on the “how” — how to create the product and launch with an effective go-to-market strategy.

This “how” is the subject of an article I recently contributed to Data Informed, “4 Steps to Designing and Launching an Effective Data Product.” If its data product is to succeed, the organization must:

  1. Determine goals and develop a user-led product design
  2. Establish pricing and packaging
  3. Roll out a successful launch
  4. Continue to add new features and functions

For a detailed description of each step — and a real world case study — I invite you to read the full article here.

To learn more about choosing the right embedded analytics solution for your organization, download the report Which Embedded Analytics Product is Right for You?

How The Election Shined a Spotlight on the Limitations of Predictive Analytics Technology

The results of this week’s presidential election came as a shock to many people. But as I sat glued to CNN, watching as more and more states flipped from Blue to Red, all I could think of was how this election stands as a stark reminder of the risks that come with predictive analysis technology.

Politics, just like every other industry, has become enamored with the power of big data. And as political analysts crunched the numbers and ran predictive data models over the past months, a Hillary Clinton victory seemed almost assured, with the major vote forecasters placing her chances of winning between 70 and 99 percent. Of course, we all now know how wrong those predictions were.

The promise of predictive analytics is an intoxicating one, and it’s easy to see why. Traditional analytics technology is primarily retrospective, but radical advances in AI and computing power have promised to shift this paradigm to a forward looking one. Using machine learning and predictive analysis, it’s now possible to come up with reasonably accurate predictions about what will happen in the future. But this new wave of technology comes with heightened stakes and greater risks, and the near total failure by political experts to foresee Donald Trump’s victory perfectly illustrates the pitfalls of flawed assumptions, wide margins of error and lack of context when it comes to predictive analysis.
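To see how much a single modeling assumption can move a forecast, here is a toy simulation of my own (it is not drawn from any real forecaster’s model): a candidate leads by two points in five swing states, and the only thing that changes between the two runs is whether polling errors are assumed to be independent or shared across states.

    import random

    def win_probability(lead=2.0, error_sd=3.0, states=5, need=3,
                        correlated=False, trials=100_000):
        """Share of simulations in which the leading candidate carries at least
        `need` of `states` swing states, given a polling error of `error_sd` points."""
        wins = 0
        for _ in range(trials):
            shared_miss = random.gauss(0, error_sd)   # one nationwide polling miss
            carried = 0
            for _ in range(states):
                miss = shared_miss if correlated else random.gauss(0, error_sd)
                if lead - miss > 0:                   # the lead survives the miss
                    carried += 1
            if carried >= need:
                wins += 1
        return wins / trials

    print(win_probability(correlated=False))  # ~0.9: independent misses mostly cancel out
    print(win_probability(correlated=True))   # ~0.75: one shared miss can sink every state at once

Roughly 90 percent versus roughly 75 percent from the exact same polls; much of the spread in the 2016 forecasts came down to assumptions of this kind.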

The ability to use technology to gaze into huge amounts of data to discern upcoming trends and inform future actions can be a massive advantage, and I do believe that AI will transform the business landscape especially as far as business intelligence and predictive analytics are concerned. But this technology is young and still developing, and predictive algorithms are not magic. They rely on data models that can be based on flawed assumptions, and if those models are wrong the results can be disastrous. Just look at the event known as Knightmare, where the Knight Capital Group lost $440 million in 30 minutes when their trading algorithm went haywire.

My point is that predictive analytics, while potentially game-changing, have limitations. And as professionals from every industry become ever more reliant on these tools to make increasingly important decisions, it’s more essential than ever to appreciate and account for these limitations. Predictive technology gives probabilities, not guarantees. That is why there will always be a need for experienced, insightful experts to plug the gaps left by this type of technology. Only by applying years of experience and expertise can one take those probabilities and apply the intuition, creative thinking and context needed to make truly informed decisions. As Steve Lohr wrote in a recent New York Times article, “data is the fuel, and algorithms borrowed from the tool kit of artificial intelligence, notably machine learning, are the engine.”

But we still need to be in the driver’s seat, remembering to never take our eyes off the road.
