
Feed aggregator

How AI Is Transforming the Customer Experience

Good Data - 4 hours 19 min ago

Businesses are facing new challenges when it comes to attracting and retaining customers. Empowered consumers have seized control of their relationships with brands, and many of the tried-and-true approaches to reaching them are no longer effective. To keep pace with this transformation, forward-thinking companies are increasingly leveraging the power of artificial intelligence (AI) to elevate their customer experiences.

GoodData CEO Roman Stanek recently shared with Information Age his insights on how AI and machine learning can help companies increase conversions, reduce customer churn, and improve those all-important customer satisfaction numbers.

Increasing Conversions: Traditionally, brands have turned to A/B or multivariate testing to solve their conversion challenges. This process can take weeks or months to yield actionable results — too long for companies that need to stay competitive. AI allows businesses to track in real time which visitors turn into prospects or customers, which means user experience teams can test a huge number of ideas in a short amount of time and take action immediately.

Reducing Customer Churn: Using machine learning, companies can create a historical model that lets them predict which customers are likely to stay and which ones are in danger of leaving. Once their data is cleaned up and optimized for this purpose, businesses can take a proactive rather than reactive approach, reaching out to customers in the “danger zone” with tailored offers and communications.
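
To make this concrete, here is a minimal sketch of the kind of historical churn model described above, in Python with scikit-learn. The file names, feature columns, and risk threshold are illustrative assumptions, not part of any specific GoodData implementation:

  # Minimal churn-model sketch; file names, columns, and the 0.7
  # threshold are illustrative assumptions.
  import pandas as pd
  from sklearn.linear_model import LogisticRegression
  from sklearn.model_selection import train_test_split

  history = pd.read_csv("customer_history.csv")  # one row per past customer
  features = history[["tenure_months", "support_tickets", "monthly_spend"]]
  labels = history["churned"]  # 1 = left, 0 = stayed

  X_train, X_test, y_train, y_test = train_test_split(features, labels, test_size=0.2)
  model = LogisticRegression().fit(X_train, y_train)

  # Score current customers and flag the "danger zone" for proactive outreach
  current = pd.read_csv("current_customers.csv")
  risk = model.predict_proba(current[features.columns])[:, 1]
  danger_zone = current[risk > 0.7]  # threshold is a business decision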

Boosting Customer Satisfaction: Companies have historically struggled to get inside the minds of their customers, usually turning to tedious surveys to get the information they need. With AI, companies can offer deals in exchange for customer data monitoring and gain access to information on website clicks, social posts, mobile device usage, and more. In addition, natural language processing capabilities can monitor sentiments on social media posts, parsing positive promotions from complaints.
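
As a rough illustration of the sentiment-monitoring idea, the snippet below scores two sample posts with NLTK's off-the-shelf VADER analyzer; the posts and the cutoff scores are invented for the example:

  # Sort social posts into promotions vs. complaints by sentiment score.
  import nltk
  from nltk.sentiment.vader import SentimentIntensityAnalyzer

  nltk.download("vader_lexicon", quiet=True)
  analyzer = SentimentIntensityAnalyzer()

  posts = [
      "Love the new release, setup took five minutes!",
      "Still waiting on support after three days. Unacceptable.",
  ]
  for post in posts:
      score = analyzer.polarity_scores(post)["compound"]  # -1 negative .. +1 positive
      bucket = "promotion" if score > 0.05 else "complaint" if score < -0.05 else "neutral"
      print(f"{bucket:>9}: {post}")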

A one-size-fits-all approach doesn’t work with today’s customers. Brands must offer a personalized experience, and AI gives them the capabilities they need to make each buyer experience unique. Brands that harness the power of these new technologies are laying the foundation for attracting more new customers ... and ensuring that they stay.

Read the full article on the Information Age site here.

Categories: Companies

The Key to a Great Customer Video: Fostering Partnerships

Good Data - 4 hours 19 min ago

Creating customer videos has been on my Wish List for quite some time, and recently we launched our first customer video featuring none other than ServiceChannel! I love working with ServiceChannel, as they are always looking for new ways to disrupt their industry and are happy to share their success through various co-marketing activities. Ours is truly a partnership, and I consider myself very lucky to have so many partners that want to work with me on different projects.

The timing for my video project was perfect, as I learned that ServiceChannel was building a Smart Business Application using Machine Learning for their Decision Engine that would be launching early Q2/17. That gave us time to work together on a launch plan so that both companies could share the news about ServiceChannel distributing machine learning capabilities to their customers at the same time, creating maximum impact.

Given that ServiceChannel is changing the way facilities managers and contractors work, analysts are always asking to speak with them. Along with creating a video about their growth with GoodData and addition of Machine Learning, we also worked with Blue Hill Research on the report, Beyond Self-Service: How Machine Learning Drives Enterprise Data’s Third Wave with input and content from ServiceChannel’s VP of Marketplace Strategy & Experience, Sid Shetty.

We’re following up with a webinar featuring Blue Hill Analyst Toph Whitmore and Sid Shetty on June 14 at 11AM Pacific. That’s a lot of content created from our initial discussions around a customer video. Stay tuned for our next customer story, coming soon!

 


Categories: Companies

Enterprise Data is Coming of Age, Thanks to Machine Learning

Good Data - 4 hours 19 min ago

As more organizations embark on their digital transformations, the fundamental role of data across all departments and roles has come under the spotlight. In a recent report published by Blue Hill Research, Principal Analyst Toph Whitmore reveals that digital transformation as it concerns data is not, as some may see it, a “once and done” event, but one that involves three distinct phases:

The Three Phases of Digital Transformation

Whitmore explains that, in the first phase of digital transformation, Commodity Storage, enterprises gather data, store it, then lock the door and hand over the keys to their IT departments. In Phase 1, the average business user is totally walled off from enterprise data:

“For example, the business analyst submits a data request, the SQL analyst parses it and hands it off to IT, IT eventually processes the request, then passes data back to the business analyst, who in the ensuing three weeks has moved onto something else.”

 

At this stage the process is slow, cumbersome, and only effective in cases where weeks- or even months-old data is still valuable to the end user.

 

As users start to demand better, faster access to the timely information they need, the enterprise moves into the second phase, Self-Service Everything. No longer siloed in the IT department, data becomes an accessible asset, placing valuable information (ideally) at users’ fingertips when and where they need it. In this phase, as Whitmore puts it, “[d]ata access becomes (seemingly) immediate, offering faster time to insight, and the promise of faster/better decision-making.”

Phase 2, where most enterprises find themselves today, presents its own set of limitations. Many self-service platforms are cumbersome for non-technical personnel to use, and few provide the necessary context business users need to do their jobs more effectively. Furthermore, as Whitmore points out, “self-service delivery can outpace IT’s ability to govern data. As data volume grows (think IoT), business users’ ability to consume it in a timely fashion diminishes.”

When this friction reaches critical levels, forward-thinking enterprises will be forced to move on to a third phase of digital transformation: Machine-learning Ubiquity. To cope with the rising flood of data and remain competitive, enterprises will leverage artificial intelligence (AI), machine learning, and automation to deliver data-driven insights in context and within users’ existing workflows.

Progressing to this phase allows enterprises to accomplish three key goals:

  • Speed time to insight by embedding data technology at the point of work
  • Reduce risks associated with manual process delivery by applying machine learning to suggest actions or automate tasks
  • Avoid process “lock-in” by employing AI and benchmarking to let systems “learn as they go”
Machine Learning in Action: ServiceChannel

ServiceChannel provides facility managers with a single platform to source, procure, manage and pay for facility maintenance and repair services. Having passed through the first two phases of digital transformation, the organization is now working with GoodData to offer its customers a machine-learning-based decision engine.

“Going forward,” explains ServiceChannel VP, Marketplace Strategy & Experience Sid Shetty, “when our customers view proposals from their service providers, ServiceChannel's Decision Engine will recommend an action, as well as provide supporting intelligence so that customers can make the best data-driven decisions possible."

The (R)Evolution Continues

As the volume of enterprise data continues to grow exponentially and competitors turn up the heat, organizations can no longer afford to be stuck in the second phase of digital transformation. Business users across the enterprise require data-informed insights in context, where they work, in a way that leverages automation to let them focus on what they do best. Smart organizations are heeding the call to evolve, and others must follow suit or be left behind.

For more insights on how machine learning is driving the evolution of enterprise data, download the Blue Hill report “Beyond Self-Service: How Machine Learning Drives Enterprise Data’s Third Wave.”

Categories: Companies

Key Strategies for Profitable Business Analytics

Good Data - 4 hours 19 min ago

Business analytics serve only one purpose: helping people make better decisions. These decisions may occur at the level of transactions, tactical operations, or strategy. Business intelligence, for example, is largely concerned with analysis of prior performance and support for diagnostics. It is mainly used to support tactical decision making by managers at various levels of the business, and it is usually not useful for transactional or strategic decisions. The latter generally consider macro factors such as economic conditions, competitor activity, market trends, and so on.

The business analytics space is becoming quite crowded, with machine learning, prescriptive analytics, and artificial intelligence adding to the analytic mix. Machine learning concerns itself mainly with applying algorithms to historical data in an attempt to detect patterns of behavior that might be useful in future activities. In loan approval, for example, we interrogate historical data looking for characteristics that indicate a loan applicant will have no problem repaying a loan, and many loan approvals are now processed automatically with very little human involvement. Clearly there is considerable scope here for adding intelligence to operational applications dealing with customers, suppliers, employees, and trading partners. As such, the intelligence needs to be embedded into these applications so it is available at the point of work. This is also true of business intelligence, and particularly of the embedding of various visual artifacts (charts, graphs, dials, etc.) into the applications that are used day after day in a production setting.
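
As a sketch of that loan-approval example under stated assumptions (invented file and column names, a deliberately simple model), pattern detection on historical data and embedding the result at the point of work might look like this:

  # Learn approval patterns from historical loans, then score new
  # applications inside the operational workflow. All names are invented.
  import pandas as pd
  from sklearn.tree import DecisionTreeClassifier

  past_loans = pd.read_csv("loan_history.csv")
  X = past_loans[["income", "debt_ratio", "years_employed"]]
  y = past_loans["repaid"]  # 1 = repaid without problems

  model = DecisionTreeClassifier(max_depth=4).fit(X, y)

  def score_application(income, debt_ratio, years_employed):
      """Embed the learned decision at the point of work."""
      row = pd.DataFrame([[income, debt_ratio, years_employed]], columns=X.columns)
      return "auto-approve" if model.predict(row)[0] == 1 else "manual review"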

The current fascination with all things visual is understandable, but a business will not realize the efficiencies that business intelligence and machine learning can deliver, until analysis is embedded into production applications. Such analysis can speed up processing of transactional activity, and ultimately will completely automate a good deal of it.

Prescriptive analytics is fundamentally different from BI and machine learning in that it establishes how processes should execute to make best use of resources. BI and machine learning are concerned with what has happened or will happen. Prescriptive analytics is concerned with how to best use resources given various forecasts and plans. The optimal deployment of sales reps, factory scheduling and even product pricing are all activities that can benefit from prescriptive analysis, and optimization technologies are key to this.
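
To ground the idea, here is a toy prescriptive example: choosing production quantities that make the best use of limited machine and labor hours via linear programming, one of the optimization technologies referred to above. The profit figures and capacity limits are invented:

  # Maximize 40*x1 + 30*x2 profit subject to capacity limits.
  # SciPy's linprog minimizes, so the profits are negated.
  from scipy.optimize import linprog

  profit = [-40, -30]
  capacity = [[2, 1],   # machine hours per unit of x1, x2 (<= 100)
              [1, 2]]   # labor hours per unit of x1, x2 (<= 80)
  limits = [100, 80]

  plan = linprog(c=profit, A_ub=capacity, b_ub=limits,
                 bounds=[(0, None), (0, None)])
  print(plan.x)  # prescribed quantities of each product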

That the big payoff from analytics comes when the analysis is embedded into production applications is well demonstrated by the emergence of the Decision Model and Notation (DMN) standard, which provides diagrammatic and integration methods for embedding decisions into business processes. The decisions may be manual, semi-automated, or fully automated. What is important is that the decision logic can be integrated into a production environment.
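
Real DMN models are authored in dedicated modeling tools and executed by a rules engine rather than hand-coded, but as a rough sketch of the underlying idea, a decision-table-style rule set embedded in application code might look like the following (rules and thresholds invented):

  # First-hit decision table: return the outcome of the first matching rule.
  DISCOUNT_RULES = [
      # (predicate on the order, discount outcome)
      (lambda o: o["total"] >= 10_000 and o["loyalty_years"] >= 5, 0.15),
      (lambda o: o["total"] >= 10_000, 0.10),
      (lambda o: o["loyalty_years"] >= 5, 0.05),
  ]

  def decide_discount(order: dict) -> float:
      for predicate, discount in DISCOUNT_RULES:
          if predicate(order):
              return discount
      return 0.0  # default outcome

  print(decide_discount({"total": 12_000, "loyalty_years": 2}))  # 0.1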

Finally we come to artificial intelligence (AI), a broad collection of technologies and methods that support automation of processes and decisions. Machine learning comes under the umbrella of AI, and more advanced techniques such as deep learning are enabling highly accurate image recognition, language processing, and speech processing. Again, AI does not sit in a world of its own; it is most useful when it can assist with, and possibly automate, day-to-day operational activities. Examples include chatbots that interact with customers via messaging platforms for customer support and helpdesk applications. AI is also being used to automate business process improvements and execute real-time marketing optimization.

In all of this the key issue is that analysis and intelligence should be integrated into the working environment. Some suppliers call these smart applications, presumably in contrast to dumb applications that do little more than act as systems of record. The emergence of Internet of Things (IoT) data streams only emphasizes the point that analysis needs to execute in the production environment, and of course that the inevitable trend is toward real-time analysis.

Much greater demands will be made on infrastructure and integration technologies, and on the organization’s various skill sets to work together. Critical to all of this is the creation of processes that allow analytical models to be designed, built, implemented, monitored, and modified, so that the organization has a firm grip on the decisions these models are making. Anything less and a business may well find that its analysis has drifted out of sync with reality, with all that implies. The technology is the easy part; disciplines, culture, and methods, as always, will be the real challenges.

This article originally appeared on Butler Analytics

To learn more about building profitable business analytics, check out GoodData’s webinar with Martin Butler

Categories: Companies

GDPR: Ignore It at Your Own Risk

Good Data - 4 hours 19 min ago

If your company does business in the European Union, you are likely to face a major overhaul of the way you handle your customer data. That’s because in 2016, the European Parliament passed the EU General Data Protection Regulation (GDPR), a sweeping change that will affect all companies doing business with EU residents, regardless of where the companies are based.

To understand the GDPR, it helps to understand the European view of privacy. In Europe, unlike in the United States, personal privacy is seen as a fundamental human right rather than just a consumer protection issue. In the interest of protecting this right to privacy, the EU is mandating that as of 25 May 2018, all companies doing business with its residents must:

  • Have a valid reason for collecting and using all forms of personal data
  • Obtain consent for any use of data outside of certain pre-approved conditions
  • Present requests for consent to use personal data “in an intelligible and easily accessible form”
  • Notify authorities within 72 hours of any breaches that could compromise personal data
  • Be able to fulfil all privacy rights, including data erasure
  • Nominate a Data Protection Officer (DPO) (only required for companies monitoring data on a large scale or handling special categories of data such as criminal records)

As you may imagine, this regulation will be a game changer for thousands of companies around the world. The upside is that we still have a year to get ready. The challenge is that it will take time, effort, and yes, money to comply with this regulation that many businesses still don’t know about … and still fewer understand. Let’s look at the answers to a few of the most common questions arising from the business community about GDPR.

Which Businesses Will Be Affected?

Regardless of where your company is based, if you handle personal data of EU residents (not just citizens), you will be required to comply with GDPR. The EU defines personal data very broadly as any information that can be used to identify the individual, directly or indirectly, from innocuous information like names, email addresses, and physical addresses to social media posts and online presence footprints to highly sensitive medical and financial information.

How (and When) Will the GDPR Be Enforced?

All requirements of the GDPR will go into effect on 25 May 2018. For the moment, officials are expecting that, as with HIPAA in the United States, most compliance checks will be done via supply chain management. If you work with third parties who process data on your behalf, the GDPR expects that you will assess those companies’ compliance with its requirements.

Apart from defining the fines, the EU has told us little about how it plans to enforce the regulation. Personally, I expect that shortly after the enforcement date, officials will begin performing audits, and I believe they will begin with small-to-medium size businesses. Large companies with sprawling compliance departments and dozens of lawyers will be tricky to go after, whereas targeting a business without a large back office makes it easier to set a precedent and demonstrate that the EU is serious.

What Are the Penalties for Noncompliance?

The maximum fine for the most serious infringements will be up to 4 percent of global revenue or €20 million, whichever is greater. Lesser infractions will be subject to smaller penalties; for example, a company’s fine for not having its records in order will be 2 percent of global revenue.

How Can We Prepare, and How Much Will It Cost?

The first step in preparation is to fully understand the regulation’s requirements and how they will affect your business. For starters, visiting eugdpr.org will give you some solid insights into the regulation as well as into its background and controversies.

Once you have a clear picture of how your business will be affected, I recommend the following steps:
  1. Appoint a cross-functional GDPR task force, reporting to the executive team.
  2. Make a map of how you collect and use personal information from EU residents.
  3. Use the map to assess compliance with GDPR requirements, and make a plan to fill any gaps by May 2018.
  4. Assess all vendors who handle personal data on your behalf, and work with them as needed to ensure compliance by May 2018.

The cost of all this will depend on the size and scope of your company, the nature and sensitivity of the data you handle, your current level of compliance, the number of vendor relationships that will be affected, and so on. Once you’ve determined what needs to be done, you’ll want to assess the costs and work them into your budget.

So, what will the global business environment look like after 25 May 2018? I believe that for large enterprises, apart from additional paperwork, little will change: Many have large compliance teams and have already implemented similar measures as part of their standard practices. Startups, on the other hand, will face a serious burden, as they will have to comply from Day One of their existence, and I can see this causing the EU to fall behind the United States when it comes to business innovation. One thing we can all be certain of is that the GDPR will change business as we know it, and the best we can do is make sure we’re prepared.

 

About the author:

Tomáš Honzák serves as the head of security, privacy and compliance at GoodData, where he built an Information Security Management System compliant with security and privacy management standards and regulations such as SOC 2, HIPAA and U.S.-EU Privacy Shield, enabling the company to help Fortune 500 companies distribute customized analytics to their business ecosystem.

 

This article originally appeared at Infosec Island

Categories: Companies

Beyond Dashboards: Why You Need Insights that Drive Actions

Good Data - 4 hours 19 min ago

The days of “data for data’s sake” are over. Forward-thinking organizations are recognizing not only the need to translate data into strategic insights, but to make those insights more accessible, understandable, and relevant for business users at all levels.

In their report “Insight in the Moment: Analytics Embedded at the Point-Of-Decision,” the Aberdeen Group explores the forces driving organizations to take this next step in the evolution of analytics, including employee demand for a more seamless, contextual data experience:

“Users are demanding that [analytical] capabilities are delivered in context, where they work, with a better framework of how and why the insight is important as well as how it can be used.”

Dashboard Crash: Moving Beyond Self-Service Analytics

“Self service” and “data democratization” have become the mantras of organizations seeking to reduce users’ reliance on IT for the data they need. But as GoodData CEO Roman Stanek recently commented in Data Center Knowledge, “the ability of these tools to provide clear insights is rapidly degrading. And simply trying to cram more information into a dashboard will only compound the problem.”

The first challenge many organizations encounter with the self-service approach is where the insights are located. When users are forced to click away to a separate dashboard to access data, a disconnect arises between the information and the organization’s everyday workflows.

The second challenge involves how the analytics are presented: too many tools present business users with reams of raw data. Not only do users not have the time to dig through all this information, but they also lack the data science background that would be needed to identify intricate patterns and distinguish the strategic insights from the “noise.”

Welcome to the Future

As organizations struggle with the limitations of self-service analytics, new tools are delivering the more streamlined, contextual experience that users demand.

By advancing from analytics to insights, these tools take on the “heavy lifting” involved in interpreting the data for users’ specific needs. Advanced platforms now harness machine learning and artificial intelligence to automate analysis of the data and truly offer actionable insights.

By offering data-driven insights at the point of decision, these platforms integrate intelligence not only within the applications that business users access every day, but also within everyday workflows. As the Aberdeen Group discovered in its survey, embedded insights offer a higher degree of user satisfaction than the standalone approach.

Insights in Action: Fourth

When GoodData client Fourth identified a need for enhanced analytical capabilities, they also saw a need for customizable data views to fit the specific needs of each user across a diverse customer base. The new platform enabled users to identify areas for cost reduction in the value chain by recognizing interdependencies among the data. As the Aberdeen Group reports, one Fourth customer was able to reduce labor costs as a percentage of sales by 1 percent a year, a significant reduction for the restaurant chain.

Delivering insights at the point of decision is more than an interesting concept — it represents the future of operational data analysis as organizations know it. By offering actionable, real-world intelligence within the applications that organizations are already using, advanced platforms are empowering users with the strategic insights they need to reduce costs, grow profits, and realize unprecedented performance benefits.

For more insights on the evolution of embedded analytics, download the Aberdeen report “Insight in the Moment: Analytics Embedded at the Point-Of-Decision.”

Categories: Companies

Financial Companies Need to Look Beyond Analytics and Focus on Insights Platforms

Good Data - 4 hours 19 min ago

Here’s a startling statistic: 80% of corporate business intelligence projects fail. These initiatives don’t fall short because servers melt down or data goes uncrunched; they fail because, after long and often expensive implementation cycles, they produce no concrete value for the enterprise. Put another way, they fail to meaningfully transform business processes in a way that creates competitive advantage.

I recently had the opportunity to present at Finovate Spring in San Jose, California, where some of the hottest companies in the fintech sector were in attendance. But I was struck by the fact that, while most of these companies would consider themselves among the “data elite,” many struggle to make the transition from data capture and analysis (R&D) to outcomes (production systems that create competitive advantage).

I believe this is because they rely on tools and systems that put the burden on business users to uncover insights and take action. As the volume of raw data that financial enterprises generate as part of their daily operations grows exponentially, the ability of legacy analytical tools to provide clear insights is degrading rapidly. And simply trying to cram more information into a dashboard will only compound the problem by presenting too many signals for most business users to understand and derive meaningful insights from.

Fortunately many of these companies have started to recognize this issue and search for solutions. I couldn’t help but notice a huge emphasis on machine learning and Big Data at Finovate, specifically around how these technologies can be applied to drive better outcomes for financial institutions. This has been a massive area of focus for GoodData; as a company, we have worked tirelessly to develop a purpose-built platform that combines internet-scale data integration, data enrichment, machine learning, and automation recipes to deliver actionable insights directly to your business users at their point-of-work. Our goal is to close the data-insight-action loop.

These “Systems of Insight” go far beyond the capabilities of legacy analytics platforms. Rather than just looking back to past data to infer trends, they leverage advances in machine learning, AI and predictive analytics to deliver insights where and when they are needed most to recommend actions, drive outcomes, and ultimately change the way companies use their data to do business.

To learn more, please check out the video below of my presentation at Finovate 2017.

 

 

 

Categories: Companies

What if We Were ALL Wrong about Data Democratization?

Good Data - 4 hours 19 min ago

As the volume of raw data that businesses of all types generate as part of their daily operations grows exponentially, the idea of giving employees throughout the organization the ability to derive insights and value from this information is extremely attractive. However, as the flood of data continues to increase, the ability of legacy analytical tools to provide clear insights is degrading rapidly. Simply trying to cram more information into a dashboard will only compound the problem by presenting too many signals for most business users to understand and derive meaningful insights from.

The focus in the BI industry over the past several years on delivering so-called “Self-Service Analytics” has made it difficult for business users to keep up with analytics and predictive monitoring in addition to their core job duties. Not every business user needs access to self-service analytics tools; most don’t have the time or expertise to parse through and make sense of all the data available to them. More importantly, I would argue that they should not have to. We don’t need more data in the hands of more people; we need better insights in the hands of more people.

I recently had the opportunity to discuss this concept in detail in Data Informed. While the idea behind data democratization isn’t a bad one, truly fulfilling the promise of Big Data requires getting the right information to the right people at the right time, in a way that they can use it. Achieving this, and making full use of the mountain of structured and unstructured data that your organization generates, will require evolving our business intelligence tools. What is needed is a new generation of analytical systems that take the heavy lifting out of data analysis and automatically give employees the insights they need to focus on their core responsibilities, make better decisions, and drive more impactful business actions.

Read the full article in Data Informed here.

Want to learn more? Download Aberdeen’s report on Analytics Embedded at the Point-of-Decision

Categories: Companies

Systems of Insight and AI: The Future of BI

Good Data - 4 hours 19 min ago

Lately, I’ve been thinking a lot about how I define Business Intelligence and where I see the industry heading. More and more, this connects back to embedded AI, machine learning, predictive analytics, data enrichment and other AI methods, but at the end of the day we all need to be on the same page.

At its core, BI is all about demonstrating profitable activity with analytics and enabling businesses to use their data to drive actions and outcomes. The popular thinking is that self-service BI tools are easy to use and good for the business, and business users like the idea of having their own data analysis tools. But what organizations are failing to realize is that they will be far more productive and profitable if they use AI and machine learning to automate the mundane decisions that most people use current BI tools to make and allow their employees to focus on more strategic problems.

I don’t believe that the concept of self-service analytics is a scalable one. Employees should be spending their time on their core job functions instead of slicing and dicing data. Embedding analytics at the point of work and automating mundane decisions with machine learning and AI enables business users to take immediate actions to improve business outcomes. Employees can focus on their day-to-day responsibilities while machine learning takes care of automating the tasks that don’t require the expertise, experience, and context that only a human can provide.

I’ve been inspired by JP Morgan’s COIN program; it’s an outstanding example of how a business can leverage machine learning to automate decisions and, in the process, save itself millions of dollars a year. COIN is a learning machine that automates in seconds the mundane task of interpreting commercial-loan agreements — work that, until the project went online in June, consumed 360,000 hours each year from lawyers and loan officers. This perfectly illustrates the business value that machine learning can deliver.

Startups and the world’s leading technology companies alike are pouring investment into AI technologies and machine learning capabilities, and those that aren’t are behind the curve. Tractica predicts that the market for enterprise applications of AI will surpass $30 billion by 2025, with a focus on better, faster, and more accurate ways to analyze big data for a variety of purposes. Those investing in advanced AI and machine learning capabilities will lead the BI industry as it moves more and more toward automation.

Right now, the BI industry remains focused on self-service capabilities, but I think there should be less emphasis on self-service flexibility and more on automation and AI. At GoodData, we are focused on creating solutions that support embedding AI at scale to automate the basic business decisions that people mostly use BI tools for, which is why we will keep investing in predictive analytics, machine learning, data enrichment, and other AI methods.

GoodData wants to be part of the production environment, and we believe that the best way of doing that is deploying Smart Business Applications that harness the above technologies to make organizations more productive by allowing employees to spend less time analyzing data and more time focusing on daily tasks. Smart Business Applications involve the seamless integration of BI tools with day-to-day business applications and workflows. They work by delivering relevant, timely, and actionable insights within the business applications where work is already being done, so decision makers no longer have to stop what they’re doing and open another app to get the insights they need. By putting these intelligent insights right in front of them the moment they need them, GoodData is ushering in the next generation of BI.

Categories: Companies

From Descriptive to Prescriptive: How Machine Learning Is Fostering a New Era of Analytics

Good Data - 4 hours 19 min ago

Imagine going to your doctor with a problem. He gives you a thorough checkup, looks at the results, and tells you, “You have a bad case of [fill in the blank].” Then he pats you on the shoulder and says, “Good luck with that.”

It sounds ridiculous, but that’s what many organizations have been getting from their analytics and Business Intelligence platforms: a diagnosis with no prescription. Today, thanks to advances in machine learning and artificial intelligence (AI), a new era of BI — one that incorporates predictive, prescriptive, and cognitive computing as well as automation — is now possible.

In a recent article in BetaNews, GoodData Vice President of Product Bill Creekbaum explores the new BI capabilities that machine learning and AI have enabled:

Data is being analyzed faster and more accurately with these advanced analytics frameworks, and decisions are being automated with machine learning to decrease human error and increase the organization’s bottom line profit. Machine learning can detect new patterns and opportunities that humans cannot.

 

When organizations implement these advanced frameworks, analytics are no longer a matter of “what is happening?” but of “what will happen next … and what can we do about it?” By embedding analytics with AI capabilities at the point of work, they can deliver insights at the time and place where they can drive better business actions. And they can automate simple, routine decisions, allowing business users to focus on solving more complex problems.

Despite the tremendous advantages that AI-powered analytics offers, Creekbaum notes that adoption has been slow. According to a recent survey, only 28 percent of respondents reported having experience with machine learning, while 42 percent said their organizations lack the skills needed to support it.

How long can organizations continue to avoid integrating AI into their BI frameworks? Not long, according to Creekbaum:

Legacy BI platforms can no longer keep up as the world becomes more digital and automated. In order to remain competitive, businesses must be looking toward BI solutions with AI and machine learning capabilities. AI is the future for BI, and those that aren’t investing will lag behind the competition.

 

For more of Creekbaum’s insights, read the article on the BetaNews website.

Categories: Companies

Artificial Intelligence is the Future of Business Intelligence

Good Data - 4 hours 19 min ago

A major focus for Business Intelligence (BI), data, and analytics vendors has been delivering tools that are ‘self-service.’ These solutions are based on the premise that with just a few easy clicks, users of all skill levels (regardless of their ability with data and analytics) can quickly sift through volumes of data to make radically better business decisions. But there’s a big problem: as the flood of data that businesses generate as part of their normal operations continues to increase exponentially, the ability of current tools to provide clear insights is rapidly degrading. And expecting everyone in the organization to become data scientists and business experts, in addition to performing their core business responsibilities, is simply unsustainable.

"By 2018, most business users will have access to self-service analytical tools,” said Anne Moxie, a senior analyst at Nucleus Research. “But the fact remains that there’s too much data for the average business user to know where to start.”

The fundamental issue is that even with these ‘self-service’ tools, most business users lack the analytical skill to reliably identify the intricate patterns in data that could be important information, or nothing at all. And more importantly, they shouldn’t have to.

In a recent article in Data Center Knowledge, GoodData CEO Roman Stanek argues that radical advances in computing power, predictive analytics, machine learning and artificial intelligence have opened the door to a new generation of analytic tools. If implemented properly, these systems can finally make good on the promise of Big Data by instantaneously pulling actionable insights from complex data sets and automatically surfacing them as recommended business actions, where and when they are needed most.

This isn’t some far off science fiction future. Machines are already getting better at extracting insights from complex data than humans are, and these cognitive abilities will only improve. Rather than continuing to pour money into tools that require employees to spend their time manually analyzing data and making mundane decisions, business organizations should be investing in next-generation systems that automate the bulk of these processes and allow their talent to focus on the strategic problems that really move the needle.

To learn more about the intersection of artificial intelligence and business intelligence, check out Roman Stanek’s article in Data Center Knowledge.

Download the Nucleus Research GoodData Guidebook to learn more about the next generation of predictive analytics and Smart Business Applications.

Categories: Companies

When Bridging the IoT Skills Gap, Product Teams Lead the Way

Good Data - 4 hours 19 min ago

Has the technology of the Internet of Things (IoT) evolved faster than the skill sets required to leverage it in achieving business objectives? Given the scarcity of job postings with “IoT” in the title, it would appear so.

In a recent article, Internet of Business columnist Adam Bridgwater posed this question:

“With a looming skills gap in front of us, how will we consolidate skills in the software engineering realm of developers, architects and device specialists to provide for our IoT-enabled future?”

For answers, he consulted executives from three software industry leaders, including GoodData CEO Roman Stanek.

Product Management Leads the Way

In his response to Bridgwater’s question, Stanek explained that the skills needed for IoT success fall squarely within the product management role, where one finds awareness of the vision, strategy, and security needed to handle IoT data effectively. The new skills required to execute on IoT initiatives will fall into two categories:

  • “How to Design”: How IoT fits into the product
  • “How to Protect”: The security aspect of handling data from IoT devices

Broadening the Scope

While IoT strategy should be owned by product management, in close collaboration with the security team, multi-functional collaboration across teams is key for IoT success.

“IoT needs to be collaborative because traditional expertise needs to be extended,” Stanek said, in the same way that today’s auto industry requires knowledge that goes far beyond carburetors and cylinder blocks.

For more of Stanek’s insights, read the article on the Internet of Business website.

Want to learn more? Click here for a webinar on Profiting from Actionable Analytics.

Categories: Companies

From Data to Insights to Action: How Smart Business Apps Are Closing the BI Loop

Good Data - 4 hours 19 min ago

Not so long ago, the user experience of any app was determined by the technology. We all used Outlook because at the time it was the only option available, and users had to either adapt to the interface Microsoft gave them or … well, there was no “or.” Until Gmail showed up and proved to us all that there was a better way.

Today, the user is in charge. Features like intuitive design, personalization, and voice control have become standard in consumer-facing technology (think Siri, Alexa, or Uber/Lyft). Users now expect similarly tailored experiences from their business applications. And with so many options available in every category, they aren’t “stuck” with apps that fail to deliver the experience they want.

As data professionals, the question we need to ask ourselves is this: “Are traditional BI apps user-centric enough to meet the demands of today’s decision makers? And if not, can they make the leap?”

BI Yesterday and Today

Before we dive into that question, let’s take a look at the evolution of business intelligence. The very first BI apps were developed when companies had limited amounts of stored data. Their basic function was to centralize information and deliver “dashboards” to serve as a readable interface between users and raw data.

Eventually, BI evolved into standalone apps that data analysts or BI team members used to perform ad hoc analytics and distribute the results to decision makers. These apps were more sophisticated in terms of the data they delivered, but were still not capable of driving action. In other words, they were still static dashboards, and executives could look at them all day long and still not understand which important changes were taking place … or which actions they needed to take.

Today, enterprises have data coming at them from all sides: big data, unstructured data, data from the “internet of things” (IoT), and data from countless third-party systems. For present-day decision makers, static templates with predetermined sets of questions are no longer enough. They need a contextually aware solution.

Enter the third phase of BI evolution: smart business applications. The ability to gather, store, and report on data is yesterday’s news; now it’s all about making the data work for the user.

Why Smart Business Apps?

Smart business applications represent the integration of BI tools with day-to-day business applications and workflows. They deliver relevant, timely, and actionable insights within the business application where the work is being done. Decision makers no longer have to stop what they’re doing and open another app to check the data. Now the data is right there in front of them the moment they need it.

Smart business apps release data from the exclusive realm of data scientists and BI teams, making it readily available throughout the organization to service the needs of both the ‘data elite’ and business users.

Let’s look at a real-world example.

Say an organization uses a fictional app called “ExpenseHero” for their travel and expense management. Here’s what the workflow within the app looks like:

  1. An employee submits an expense.
  2. An auditor performs simple rule-based QA tasks such as checking whether the amount on the receipt matches the line item in the report.
  3. The request is forwarded to a manager, who approves or rejects the expense.

Now, we all know that managers are busy — really busy. What are the odds that they have the time to thoroughly review every expense request that comes across their desks? Practically nil, and that’s where far too many companies lose far too much money. It’s not about employees committing fraud or acting irresponsibly; it’s about the lack of sufficient time and resources to enforce the guidelines that are in place.

Managers may well have access to dashboards telling them what their spend has been this quarter, how much of their budget is left, and who their top 10 “spenders” are. What the dashboard doesn’t tell them is which actions they need to take and when to take them. It’s a good “pulse check” to look at on a monthly or quarterly basis, but it has little to no use in making day-to-day decisions.

So, how can we make this workflow “smart” in a way that enforces good hygiene in expense reporting without adding to the manager’s workload? Here’s what that might look like:

  1. The manager opens the expense app and sees the requests that need to be approved or rejected.
  2. Instead of opening a separate app to see the data that can inform her decisions, she sees a small recommendation window as she hovers over each request. The window offers a recommendation to accept or reject the expense along with justification for the recommended action.
  3. The manager accepts or rejects the requests on her screen, confident that her decisions have been guided by actual data.

This little recommendation window may not seem impressive, but it represents a tremendous amount of activity going on behind the scenes. The app probably has to run a machine learning algorithm to create that recommendation, taking into account a wealth of data to create a robust predictive model. For example, if John Smith typically submits expense reports in the range of $4,200 to $5,600, but suddenly he submits a request for $7,200, the app can recommend either rejecting the expense or giving it a closer look.
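
A production system would draw on a far richer model and external data, but a minimal sketch of that behind-the-scenes check, using the figures from the example above, could be as simple as flagging amounts that deviate sharply from an employee’s historical norm:

  # Flag a submitted expense that falls outside the employee's usual range.
  # A real recommendation engine would use a much richer predictive model.
  import statistics

  def recommend(history: list[float], submitted: float, tolerance: float = 2.0) -> str:
      mean = statistics.mean(history)
      stdev = statistics.stdev(history)
      if abs(submitted - mean) > tolerance * stdev:
          return f"review: ${submitted:,.0f} is unusual vs typical ${mean:,.0f}"
      return "approve"

  john_history = [4200, 4800, 5100, 5600, 4500]  # John's typical range
  print(recommend(john_history, 7200))  # flags the $7,200 request for review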

To deliver a reliable recommendation, apps must often reach outside the company for relevant information such as census data and details on key events. I’ll give you an example from my own experience: I once had to travel to Minnesota on business, and my trip happened to coincide with the Ryder Cup golf tournament. If you know anything about the hotel industry, you know that room rates can go up quite a bit during major events like this one. Sure enough, my room expenses were well outside my normal range, and the finance team rejected them.

Smart business apps can also tap into your CRM data to further refine their recommendations. Getting back to John Smith and his $7,200 request, let’s say that John is your top sales rep and that the size of his close amounts more than justifies the expenses he incurs as part of the sales process. Once the system “learns” this about John Smith, it can automatically adjust the parameters for accepting or rejecting his requests.

It’s an exciting time for business intelligence. With the arrival of smart business applications, we are closing the data loop by helping users turn insights into action. And by embedding those actionable insights directly into the application, we are

  • Eliminating any bias or irrelevant details (“noise”)
  • Making complex machine learning models — the ones that were formerly reserved for analytical elites — available across the organization so that everyone can put data to work for them
  • Making incremental improvements to decision-making processes that can lead to substantial bottom-line benefits over time

Download the Nucleus Research GoodData Guidebook

Categories: Companies

Nucleus Research releases GoodData Guidebook

Good Data - 4 hours 19 min ago

In today’s crowded market, the measure of success for embedded analytics is no longer limited to analyzing data to improve internal operational efficiencies. Now, in order for organizations to extract the maximum benefit from their data, they should look to monetize it.

So says Nucleus Research in their recently released GoodData Guidebook, which explores how different companies use GoodData’s analytics platform and services to monetize their data assets and extract latent value from their data. In fact, Nucleus found that embedding GoodData’s analytics into their offerings was a significant differentiator that played a critical role in closing 80 percent of partner sales.

For their report, Nucleus interviewed multiple companies that had sought out vendors for advanced embedded analytics solutions. The report identifies the key reasons these companies chose GoodData, including:

Getting Analytics Where You Need Them

Nucleus found that a major requirement for the new generation of analytics tools is the ability to embed insights at the point of work. Previously, analytics were treated as separate, standalone systems. This created adoption problems, as business users who aren’t primarily responsible for data analysis will be reluctant to add those tasks to their workload. The solution, according to Nucleus, is to embed this functionality into other core enterprise applications. By doing so, analytics vendors can offer in-context insights to users, who can easily incorporate that analysis into their daily workflows to make more informed choices and take smarter actions every day.

Harnessing the Advantages of the Cloud

Cloud-based options were a top priority for users and partners when evaluating analytics offerings. Previous research from Nucleus found that cloud-based solutions can offer 2.3X ROI compared to on-premise solutions. This is because cloud offerings reduce hardware and software costs, lower consulting costs, and are faster to implement and easier to upgrade.

Direct Revenue Generation and Reduced Costs

In their report, Nucleus found that companies who partnered with GoodData were able to generate net-new revenue opportunities with externally facing data applications. Offering these data products played a critical role in closing 80 percent of sales while shortening the sales cycle, boosting retention, and driving an average 25 percent increase in upsell after the close.

On the other side of the coin, GoodData’s platform allowed these same companies to dramatically reduce costs on data management, analysis, and distribution as well as other areas.

The GoodData Difference

The analytics market continues to expand at a prodigious rate. But according to Nucleus, the focus on data monetization and a customer-first mentality set GoodData apart.

“Nucleus expects that the culmination of GoodData’s ability to monetize data assets for its partners, along with its ongoing commitment to customers, will allow the company to experience long-term growth and stand out in the field.”

For more insights on the benefits of cloud-based analytics solutions, you can download the full report from Nucleus Research here.

Nucleus Research: GoodData Guidebook

Partners reported that they were able to use the GoodData platform and services to monetize their data assets, which created a new source of revenue and allowed them to extract the latent value of their data.

Download Guide
Categories: Companies

How AI Will End Offshoring

Good Data - 4 hours 19 min ago

We are living in the Rise of the Machines. No, Skynet and its hordes of Terminators haven’t materialized just yet, but many fear that advances in automation and Artificial Intelligence are a serious threat to humanity in at least one respect: jobs. In America, more jobs are already lost to robots and automation than to China, Mexico, or any other country. But the global impact of these systems will be felt even more strongly. I believe that the proliferation of ‘virtual talent’ will have a profound effect on intellectual offshoring and business process outsourcing, one that will be especially pronounced for emerging countries.

Traditionally, outsourcing and offshoring have primarily been cost reduction measures. After all, why spend more on an expensive local worker sitting at a computer when the same tasks can be performed at a dramatically lower cost by an overseas worker sitting at the same computer, all while maintaining the same level of quality? In the pre-AI world, that thinking made perfect sense. But what if you could make the computer do that same job, without a human operator? The cost savings would be massive, and the business decision obvious.

Advances in AI technology are rapidly making this hypothetical a reality. Recent research from Gartner found that by 2018, 40% of outsourced services will leverage smart machines, “rendering the offshore model obsolete for competitive advantage.” And this market is only going to grow. The same report states that over $10 billion worth of these systems have already been purchased from the more than 2,500 companies providing this technology, while more than 35 major service providers are investing in the development of "virtual labor" options that leverage AI-driven service offerings.

All of this means that the intellectual offshoring we've seen since the 90s will no longer be needed or even viable, as there won’t be any business requirement for these services or economic incentive to move these tasks overseas. AI and advanced analytics allow for the automation of many tasks that are currently outsourced. That’s an extremely attractive option; automating tasks that are currently performed by hundreds of overseas employees will enable businesses to hire more expensive local talent who can focus on the difficult tasks and strategic decisions that make bigger business impacts.

AI not only softens the incentive of cheap foreign labor, it also negates the advantages of lower offshore operational costs. If you can locate the machines that are performing these tasks anywhere on earth for basically the same cost, why not keep them close to your home base of operations and save a bundle on travel, audit and compliance costs?

These shifts might take a few years as the technology develops, but they are coming, and they will fundamentally change the way the world does business. Intelligent machines are here, and companies that continue to rely solely on outdated offshoring models out of fear of the risks and challenges of being early adopters do so at their peril. Virtual labor technology can offer potential cost savings of 10 to 70 percent, so business leaders must begin planning now for how to adopt this technology and adapt their organizations to maximize its potential in order to survive and remain competitive.

Categories: Companies

Winning the Embedded Analytics Go-to-Market Race: GoodData at SaaStr Annual 2017

Good Data - 4 hours 19 min ago

With demand for their products on the rise, SaaS leaders find themselves faced with an extraordinary opportunity … and an equally intense challenge.

The opportunity is a booming market. Forrester reports that fully one-third of enterprises complement their existing offerings with SaaS, with analytics and intelligence solutions taking center stage. Of course, with expanded opportunity comes the challenge of expanded competition — especially in a field with a low barrier to entry — and the need to differentiate in a crowded field that seems to grow on a daily basis.

The good news is that SaaS brands already hold a vital key to differentiation: your data. Embedding analytics in your SaaS solution can help you drive serious results quickly, as MediGain did in achieving 1044% ROI on its analytics investment with a payback period of little more than a month. Others, like ServiceChannel, leveraged embedded analytics to increase engagement from 30% to an incredible 80%, while Demandbase helped some of its customers boost conversion by 200% by integrating GoodData’s analytics into its Performance Manager SaaS product.

And as with most competitive advantages, your embedded analytics product must get to market quickly, ahead of your competitors, if it’s going to succeed.

This speed-to-market is the theme behind GoodData’s participation in this year’s SaaStr Annual conference and trade show. Every year, SaaStr brings together the top professionals and executives from across the SaaS community to learn from the industry’s most prominent thought leaders. This year’s event, coming up on February 7-9 in San Francisco, is expected to draw 10,000 attendees for networking, learning and fun.

The GoodData team will be on hand at Booth 4 with live demos and expert advice on getting to market faster with your embedded analytics product. Schedule a time to meet with us here, and let’s talk through your biggest questions around executing a successful product launch.

SaaStr Annual attendees will also be treated to a presentation from GoodData partner Zendesk: co-founder and CEO Mikkel Svane will take the stage to share the story of his brand’s rise to become the dominant SaaS provider in the space. Be sure to check out Mikkel’s presentation, “Zendesk: From Day 0 to Today: The Lessons Learned” on Thursday (2/9) at 9:30 am.

We’ll see you at SaaStr Annual, and remember to book your meeting with our team while spaces are still available!

Categories: Companies

MediGain Brings Automation and Analytics to New Heights with GoodData

Good Data - 4 hours 19 min ago

MediGain had an absolutely incredible 2016. They received a 2016 Technology ROI Award from Nucleus Research for their implementation with GoodData’s analytics platform, a deployment that ultimately generated a 1044 percent return on investment for the company. And according to a recent article in American Healthcare Leader, MediGain’s CIO Ian Maurer has now made it his mission to take the company to a whole new level as far as automation, analytics and security are concerned.

While many of Maurer’s initiatives focus on data security, a key area of innovation has been MediGain’s analytics capability. When he first joined MediGain, their Business Intelligence team was stuck creating month-end reports for each client with an old system that was very slow and required massive amounts of manual data work.

“It was a very manual process,” Maurer said. “It required extracting data from client systems and converting it to Excel format, and then going in and scrubbing that data, creating pivot tables and all this manual data manipulation to create a report to send out as early in the month as possible.”

As the company experienced rapid growth following a series of acquisitions, they needed a quantum leap in efficiency over their legacy on-premises analysis system in order to automate these procedures and meet the increased demand.

“I decided on a platform as a service solution with GoodData, and using that tool I was able to create practice management vendor specific automated extracts,” said Maurer. “Once per day, we pull data from our client practice management systems and aggregate and normalize it all so it’s standardized from client to client, and we could look at it in an aggregated format and see all the clients rolled up, and see how we are doing as a company, and see their financial performance over time.”
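To make that workflow concrete, here is a minimal sketch of the pattern Maurer describes: pull each vendor’s daily export, rename its fields to a shared schema, and roll everything up into one company-wide view. The Python below uses entirely hypothetical vendor names and field mappings; it illustrates the normalize-and-aggregate idea, not MediGain’s or GoodData’s actual implementation.

```python
from datetime import date

# Entirely hypothetical per-vendor field mappings: each practice management
# system exports the same concepts under different column names.
FIELD_MAPS = {
    "vendor_a": {"chg_amt": "charges", "pmt_amt": "payments", "acct": "account_id"},
    "vendor_b": {"TotalCharges": "charges", "TotalPayments": "payments", "AcctNo": "account_id"},
}

def normalize(record, vendor):
    """Rename one vendor's fields to the shared, client-to-client schema."""
    return {std: record[raw] for raw, std in FIELD_MAPS[vendor].items()}

def daily_rollup(extracts):
    """Aggregate normalized records across all clients into one company view."""
    totals = {"charges": 0.0, "payments": 0.0}
    for vendor, records in extracts.items():
        for rec in records:
            row = normalize(rec, vendor)
            totals["charges"] += float(row["charges"])
            totals["payments"] += float(row["payments"])
    totals["as_of"] = date.today().isoformat()
    return totals

# One invented record from each of two hypothetical vendor systems.
extracts = {
    "vendor_a": [{"chg_amt": 1200.0, "pmt_amt": 950.0, "acct": "C-001"}],
    "vendor_b": [{"TotalCharges": 800.0, "TotalPayments": 640.0, "AcctNo": "C-002"}],
}
print(daily_rollup(extracts))  # {'charges': 2000.0, 'payments': 1590.0, 'as_of': ...}
```

The design point is the per-vendor mapping table: onboarding a new practice management system means adding one mapping, not building a new report pipeline.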

Partnering with GoodData revolutionized MediGain’s analytics offerings. Their new analytics platform eliminated the errors that had plagued the old system, while freeing up time and resources for their BI team to focus on interpreting the data and managing other projects rather than running reports all the time. In fact, Nucleus Research found that GoodData’s platform not only increased the efficiency and efficacy of MediGain’s BI staff, but also enabled them to offer new client-facing packaged analytics services for financial analysis. This implementation has enhanced MediGain’s value proposition, attracting new clients and boosting their revenue.

Learn more from Nucleus Research about how MediGain drove 1044% ROI with advanced analytics

Categories: Companies

The Three Secrets to Successfully Monetizing IoT Data

Good Data - 4 hours 19 min ago

The global business landscape is being transformed by the Internet of Things. In the emerging Connected Economy, every business will have to become an IoT business, and enterprises that don’t adapt, innovate, and transform their models risk falling behind. However, the reality is that IoT monetization is going to be difficult, especially IoT data monetization. In fact, I believe that more than 80 percent of companies that begin IoT implementations will squander their transformational opportunities.

For any company that has customers, suppliers, employees or assets, IoT monetization can be transformational. But success in IoT monetization comes down to more than just technology; it requires careful strategic planning and shifting your entire corporate mindset to treating data as the valuable asset that it is. To learn more, I encourage you to check out the article I wrote for BetaNews, where I detail my top three strategies for ensuring that your IoT initiative is part of the 20 percent that actually succeed.

Oh, and one last thing: as 2016 draws to a close, I want to take this opportunity to wish you Happy Holidays and a fantastic 2017 from all of us here at GoodData!

- Roman Stanek

Click here to read Roman’s “Three Secrets to Successfully Monetizing IoT Data” in BetaNews

Want to learn more? Click here to download Gartner’s report Prepare to Monetize Data From the Internet of Things

Categories: Companies

Profiting From Actionable Analytics

Good Data - 4 hours 19 min ago

On Thursday, December 15th, I had the pleasure of co-presenting a webinar with Martin Butler titled “Profiting From Actionable Analytics,” where we explored how enterprises can leverage their investments in data (whether it’s big, transactional, or IoT data) and drive profitability via actionable analytics. Martin looked at industry trends and thought leaders to uncover best practices, while I focused on how the GoodData platform leverages our Cognitive Computing, Distribution, Connected Insights and Analytics services to deliver Smart Business Applications that drive analytics-driven action, whether it’s an automated, recommended, or in-context business decision.

I invite you to check out the recording of the webinar.

Categories: Companies

What Fighter Jets Can Teach Us About BI Modernization

Good Data - 4 hours 19 min ago

Business is War. And just like today’s battlefields, the modern business landscape is more crowded, competitive, and lethal than ever. In both arenas, it’s no longer just survival of the fittest; it’s survival of the smartest. Success depends on using every ounce of information and intelligence at your disposal to make smarter decisions and take actions faster than your opponent. So what can a next generation fighter jet teach us about BI Modernization? As it turns out, a lot. Each relies on advanced, intelligent data management to deliver unprecedented levels of efficiency and effectiveness. And both have the potential to completely revolutionize the way things are done.

Military aviation is a lifelong passion of mine, so it’s been with great interest that I’ve followed the development of the Lockheed Martin F-35 Joint Strike Fighter. And while that program has seen its fair share of challenges, the F-35 packs some undeniably revolutionary technology that gives it critical advantages over every other fighter on earth. The F-35 isn’t revolutionary because it flies faster or maneuvers better than previous generations of fighters (side note: it doesn’t). It’s revolutionary because it completely shifts the paradigm of how it does its job, to the point where it simply doesn’t have to compete against legacy platforms on their terms. The F-35 achieves this next-generation performance through a concept called Sensor Fusion, in which information from multiple sensors is combined with advanced cognitive processing to allow F-35 pilots to see a clear picture of the entire battlefield environment in real time. In short, the F-35’s technology gives its pilots the capability to detect a threat, orient their plane to the optimal tactical position, and then decide to engage or avoid that threat before their opponents even realize they’re there.

So what does this have to do with BI Modernization? In business, as in aerial combat, mission planning and intelligence are as vital to success as execution. The faster you can paint a picture of your operational environment, identify obstacles to success, and take action to overcome them and achieve your objectives, the better. The F-35 paints this picture with sensor inputs while BI tools rely on various data sources, but in both cases it’s not the data inputs themselves but how that information is processed that enables new levels of performance.

Rather than having a complex cockpit crammed with disparate displays from each sensor (which places an immense workload on the pilot, who has to process, analyze, and prioritize this data), the F-35 integrates all this sensor data into one location and then uses its advanced processors to prioritize threats and fill in any gaps, so the pilot is presented with a clear, actionable course of action. BI Modernization shares this goal. While there is a plethora of BI solutions that provide analytics and reporting, most are not designed to cope with new formats and higher volumes of data. Simply put, the amount and variety of data available to business users is outpacing their ability to distill it into actionable insights. What is needed is a “BI F-35”: a solution that replaces the overly complex BI “cockpit” of the past with a platform that leverages advances in cognitive processing to take data from myriad sources and distill it into clear insights that inform tangible actions.
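To put the fusion analogy in BI terms, here is a loose sketch, again in Python with invented source names and scores, of what merging several feeds into one prioritized view might look like: each source emits signals, and the “fused” output is a single ranked list rather than one dashboard per source. This is an illustration of the concept, not a description of any particular platform’s internals.

```python
import heapq

# Hypothetical "sensor" feeds: each data source emits signals with a
# severity (how bad it would be) and a confidence (how sure we are).
def fuse(feeds, top_n=3):
    """Merge signals from every source and return the highest-priority items."""
    merged = []
    for source, signals in feeds.items():
        for sig in signals:
            merged.append({**sig, "source": source})
    # Rank by severity-weighted confidence, highest first: the "fused" picture.
    return heapq.nlargest(top_n, merged, key=lambda s: s["severity"] * s["confidence"])

# Invented example feeds standing in for real data sources.
feeds = {
    "web_analytics": [{"signal": "checkout drop-off up 12%", "severity": 0.8, "confidence": 0.9}],
    "support_tickets": [{"signal": "spike in billing complaints", "severity": 0.9, "confidence": 0.7}],
    "social": [{"signal": "negative sentiment trend", "severity": 0.5, "confidence": 0.6}],
}
for item in fuse(feeds):
    print(f'{item["source"]}: {item["signal"]}')
```

The ranking step mirrors what the F-35’s processors do for the pilot: the user sees the few items that matter most, not every raw reading from every source.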

Categories: Companies