Feed aggregator

Usability and Functionality Still Reign in Analytics: Nucleus Research

Good Data - 4 hours 17 min ago

Once dominated by a handful of providers, the analytics market has become a crowded field in the last few years. But despite the diversity of solutions now available, many users find that evaluating their options leads to more confusion than clarity.

However, as Nucleus Research reports in their 2016 Analytics Technology Value Matrix, customers still determine a provider’s value according to two simple criteria: usability and functionality. They want their solution to work well, and they want it to be easy to use.

Based on these two determinants, the Nucleus team created a “value matrix” classifying the market’s key providers into four categories:

  • Leaders: High usability, high functionality
  • Experts: Low usability, high functionality
  • Facilitators: High usability, low functionality
  • Core Providers: Low usability, low functionality

In evaluating each vendor, Nucleus looked at three critical components: embedded analytics capabilities, cloud deployments, and AI/machine learning.

GoodData as Leader

I’m pleased to report that in Nucleus’ evaluation, GoodData ranks among the Leaders (high usability, high functionality):

GoodData continues to be a leader in the analytics market, differentiating themselves by helping customers turn their data into a profit center. The company provides customers with highly usable analytics solutions, as well as a platform and services for organizations looking to monetize their data assets.

For a real-world case study, the Nucleus team reviewed MediGain’s deployment of the platform and confirmed that “GoodData played a key role throughout the entire go-to-market process in helping MediGain capitalize on its data.”

In conclusion, the team predicts that GoodData’s unique capabilities will fuel further growth in visibility and popularity:

GoodData can provide the analytics development and tools along with a Service Data Product team to build a complete GTM framework including market research, product design, pricing and packaging, and product marketing activities for those organizations that want to provide a data product quickly and may not have the internal resources available.

For more details about Nucleus Research’s analysis and criteria, I invite you to download the white paper here.

Categories: Companies

Mercatus: Driving Revenue Opportunities for Utility Companies

Good Data - 4 hours 17 min ago

When a power company considers a major new investment, a huge amount of data must be evaluated … and Excel spreadsheets just aren’t up to the task. Mercatus offers a better way, by providing a system that houses all this information and allows stakeholders to easily review the entire asset portfolio.

When Mercatus’ customers began requesting distributed analytics, Senior Director of Product Management Cathi Grossi and her team saw an opportunity. Creating a data product would enable them not only to deliver greater value and create upsell opportunities, but also to compile anonymous customer data for benchmarking. After briefly considering the “build or buy” decision, they decided to partner with GoodData. (Read the full case study here.)

“Building a distributed analytics solution is not the business we want to be in,” says Grossi. “We didn’t have the expertise needed, and there was no reason to reinvent the wheel.”

Thanks to Mercatus’ Pipeline Analytics product, customers can now see exactly where every project is in their pipeline, which helps them make more informed decisions, address issues, and become aware of potential disruptions. The analytics have become an integral part of the selling process, in addition to offering a unique upsell opportunity.

“Everyone wants it,” says Grossi. “We’re reinforcing the fact that when you buy Mercatus, you’re buying a business intelligence platform for the utilities industry.”

To learn more, read the Mercatus case study, “The Power of Project Insights.”

Categories: Companies

Actionable Analytics – In Search of the Holy Grail

Good Data - 4 hours 17 min ago

So we load up our shiny new visual analytics platform and start slicing and dicing data, creating charts and graphs in the latest designer color schemes, and we might even add a dash of animation. And because we now have so many sources of data, the combinations of dimensions and measures are practically endless. As a result we start to see correlations and trends where none had been visible before. These ‘insights’ come thick and fast. Only two issues spoil the party: are those ‘insights’ really insights, and what do we do with them?

To answer the first question: it has long been documented that human beings invariably add their biases when analyzing data. If we want to find a trend, sure enough we will find one. Whether it is really a trend depends on a deep understanding of the business, and on whether that rising line is in any way justified. If not, then it is probably nothing more than noise dressed up to look pretty. So we have to be very careful when digging around in data. We should understand that correlation is not causation, and that many of the ‘insights’ we gain might be nothing more than randomness made to look respectable.
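To make this concrete, here is a minimal sketch in Python, using entirely made-up random ‘metrics’, of how pure noise yields apparently strong correlations once enough dimension and measure combinations are scanned. It illustrates the multiple-comparisons problem in general, not the behaviour of any particular BI tool.

    # Illustration only: with enough metric pairs, pure noise produces
    # apparently strong correlations ("insights") that mean nothing.
    import itertools
    import random
    import statistics

    random.seed(42)

    N_METRICS = 40   # hypothetical number of measures/dimensions
    N_PERIODS = 24   # e.g. two years of monthly data

    # Completely random, unrelated "metrics".
    metrics = {f"metric_{i}": [random.gauss(0, 1) for _ in range(N_PERIODS)]
               for i in range(N_METRICS)}

    def correlation(xs, ys):
        """Plain Pearson correlation coefficient."""
        mx, my = statistics.mean(xs), statistics.mean(ys)
        sx, sy = statistics.stdev(xs), statistics.stdev(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
        return cov / (sx * sy)

    # Scan every pair of metrics and keep the "impressive" correlations.
    spurious = []
    for (a, xs), (b, ys) in itertools.combinations(metrics.items(), 2):
        r = correlation(xs, ys)
        if abs(r) > 0.5:
            spurious.append((a, b, round(r, 2)))

    print(f"Pairs tested: {N_METRICS * (N_METRICS - 1) // 2}")
    print(f"'Insights' with |r| > 0.5 found in pure noise: {len(spurious)}")

Run it and a handful of ‘strong’ correlations appear even though every series is random; that is exactly the randomness dressed up to look respectable.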

The second question – what do we do with the insights – is complicated. Let’s examine what might need to happen to make an insight actionable:

  • Business managers must be convinced that the insight is real and can be trusted. It becomes necessary to create a story, presenting all the evidence, with explanations of why the various charts and graphs are the way they are. This might seem like a straightforward process, but let’s remember that change usually involves treading on someone’s toes. So there will be resistance from those who will need to implement the change, detailed scrutiny of what is being presented, and ruthless exploitation of any holes in the evidence. Think carefully before you poke a stick into the hornet’s nest.
  • If there is broad agreement that the analysis is correct, the next step is to decide how business processes need to change. The change might be trivial, or it might go to the heart of several business processes. If it is the latter then it may take months to redesign processes and get agreement from all involved.
  • Once implemented, the changes need to be monitored, so new reporting and analysis procedures need to be put in place. Fine-tuning might be needed, or, heaven forbid, we might find that matters have not improved in the way we expected. New analysis, with new ‘insights’, might suggest additional changes. It might be best to keep quiet about that one.

It should be clear from this that analysis is really just a small part of the insight-to-action journey. While suppliers of analytics tools flatter business users that their insights are important, the reality is somewhat different. Most insights will not be insights at all, but random noise making itself look respectable. Those insights that have been interrogated and found to be true mean nothing unless business processes can be changed, and this is where the real hard work is to be found.

Before we obsess over charts and graphs we should ensure that the results of analysis can be implemented in a controlled, documented, measurable manner. To this end, standards such as Decision Model and Notation (DMN) are being introduced as a way to integrate decision logic into business processes. After all, making the results of analysis actionable generally means that people make different operational decisions.
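As a rough sketch of what ‘integrating decision logic into business processes’ can look like in code, the example below expresses a DMN-style decision table in Python. The customer attributes, thresholds, and outcomes are entirely hypothetical; a real deployment would capture agreed rules in a proper DMN engine rather than in ad hoc scripts.

    # Hypothetical DMN-style decision table: the analytical insight
    # ("churn risk rises with a usage drop and open support tickets")
    # becomes explicit, documented, testable decision logic.
    from dataclasses import dataclass

    @dataclass
    class CustomerSnapshot:
        usage_drop_pct: float   # month-over-month usage decline
        open_tickets: int       # unresolved support tickets

    # Decision table rows as (condition, outcome); the first matching row
    # wins, loosely mirroring DMN's "first" hit policy.
    RULES = [
        (lambda c: c.usage_drop_pct > 40 and c.open_tickets >= 3, "escalate_to_account_manager"),
        (lambda c: c.usage_drop_pct > 40,                         "send_retention_offer"),
        (lambda c: c.open_tickets >= 3,                           "prioritize_support"),
        (lambda c: True,                                          "no_action"),
    ]

    def decide(snapshot: CustomerSnapshot) -> str:
        """Return the operational decision for one customer snapshot."""
        for condition, outcome in RULES:
            if condition(snapshot):
                return outcome
        return "no_action"

    print(decide(CustomerSnapshot(usage_drop_pct=55.0, open_tickets=1)))
    # prints "send_retention_offer"

The point is not the specific rules but the shape: once decisions are written down like this they can be reviewed, versioned, and monitored, which is exactly the controlled, documented, measurable implementation argued for above.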

Right now we are fascinated with the shiny new things – just because they are new and visually attractive. The real work consists of having people who know when randomness is trying to pull its tricks, so that analysis actually means something, and having the processes in place to transform significant insights into action. Anything else is just a waste of time.

Within half a decade we will have AI-powered BI, and the task of finding meaningful insights in our data will become easier. We will also have the methods and infrastructure to quickly move from meaningful analysis to modified business processes. Until then we need to be cautious, and expect actionable analytics to become more of a reality as the whole process becomes automated.

"This article was originally published by Butler Analytics"

Categories: Companies

Data-Driven BI Market Expected to Reach $26.5 Billion by 2021

Good Data - 4 hours 17 min ago

According to a recent article in EconoTimes, a new report by Zion Research shows that the data-driven global business intelligence (BI) market accounted for USD 16.33 billion in 2015 — and that it could reach USD 26.50 billion by the year 2021.

“Business intelligence helps companies make better decisions,” the article states. “[S]oftware will improve the visibility of processes and make it possible to identify any areas that need development.”

The Zion report goes on to identify the largest market for BI growth as North America, which accounted for 86% of the global market share in 2015, and lists GoodData among the major industry participants in this worldwide trend.

Here at GoodData, we were very interested in this report for obvious reasons, but we were also thrilled to see several of our key strategies validated by market trends. We believe that true business intelligence is about more than dashboards; it’s about delivering actionable, data-driven recommendations and making sure those recommendations are embedded seamlessly into workflows to maximize their effectiveness.

The report supports this strategy, noting that BI is no longer tied to desktop computers: mobile business intelligence (MBI) now accounts for more than 20 percent of the global business intelligence market. Not only do mobile BI apps facilitate the viewing of data for an increasingly mobile workforce, but they also allow data captured via mobile devices to be incorporated instantly so that reports can be updated in real time.

Zion’s research also found that while on-premise analytics still account for 86% of the market share, this number “is projected to witness a decline in its shares by the end of the forecast period.” As a cloud-based analytics company, we have long expounded on the benefits of moving BI and data warehousing environments to the cloud. Among other benefits, doing so speeds deployments, avoids capital expenditure on hardware infrastructure, simplifies software upgrades, and minimizes the need for IT involvement, so we’re very happy to see the increased pace of adoption of cloud solutions in the BI marketplace.

For more details, you can read the full article here.

To learn more about how to define and measure your data’s usefulness and monetary value, download the Eckerson Group report on Data Monetization Networks.

Categories: Companies

6 Tips to Hosting a Successful Customer Advisory Board

Good Data - 4 hours 17 min ago

As we enter the holiday season and people start talking about their favorite time of the year, I also like to reflect on my favorite work time of the year: GoodData’s biannual Customer Advisory Board. It’s so energizing to get a small group of thought leaders and visionaries into a room for 2 days to show off what they’ve built, talk about their plans for the future, and discuss how, by partnering together, we can see those ideas through to fruition.

I’ve overseen Customer Advisory Boards for the past 15 years at various companies, and they’ve always been a highlight. This year’s CAB was no different, so without further ado, here are some of the key lessons I’ve learned.

  1. The product team owns the content. Everyone, and I mean everyone, wants to attend a CAB and have a speaking slot, because it’s not often that multiple customers are in one room for 2 days at the same time. And really important information is shared. AND it’s also a chance to meet customers face to face. AND it’s a LOT of fun. But other groups have a regular cadence of discussion with customers, and this is the product team’s opportunity to home in on the product roadmap. Customers want to share their visions and goals and influence how the product can help them meet those goals. I can’t tell you how many times we’ve shared a roadmap only to learn that the one ‘big’ thing we assumed customers wanted was a lower priority than we thought. Prioritization exercises are great learning experiences.
  2. Have specific goals and an agenda that will help you meet them. It’s easy to fall into the trap of wanting to get every question answered and show off everything you’re building and planning. To have an effective CAB you need to focus. After the CAB you’ll have plenty of opportunities to continue the conversations now that a relationship has been established.
  3. Keep it small. The urge to invite every ‘important’ or ‘big’ customer needs to be curtailed. Every customer is important, but that doesn’t mean all customers are comfortable sharing in groups or will be interested in the particular product features you’ll be discussing. A smaller group allows for more in-depth discussion and participation. There is nothing worse than having members who don’t participate. You don’t want to waste anyone’s time.
  4. Invite customers based on your goals, not on name, size, or previous participation. This goes back to number 3. Every CAB has a different focus or goal, and that might mean some people who have participated in the past won’t this time. That doesn’t mean they aren’t important anymore; it means their use case isn’t one you will be discussing this time, and it wouldn’t be a good use of their time. It’s hard not to see the same faces once you’ve built a rapport, but hopefully you are still engaging them for feedback.
  5. Allow enough time for networking and sharing of ideas. It always surprises me how much time our customers want to spend seeing what others have built and sharing ideas with one another.
  6. Don’t talk - LISTEN. This is the HARDEST lesson of all, but the most important. You are a facilitator of conversation and the sharing of ideas. Customers do not want to be talked at; they could listen to a webinar if that’s what they wanted. This is their opportunity to share their ideas with you. It’s important that you hear them and don’t feel the need to always respond. I’d also encourage you to consider inviting customers who may not currently provide a glowing reference. Some of the best ideas come from brainstorming together with a less-than-happy customer. And what better way to get them re-invigorated about the relationship than to have worked together on solutions that realize their vision?

Frankly, I could go on and on about the benefits and joys of hosting Customer Advisory Boards. So I’ll end with a great big thank you to our current and past CAB members. Without them, we wouldn’t have built the amazing company that we have.

Categories: Companies

How Fourth Drove 117% ROI with Data Products

Good Data - 4 hours 17 min ago

For hospitality managers, cost control is a constant juggling act, involving inventory management, personnel, pricing, and a host of other factors. They have the data they need to make informed decisions; they just don’t have the time or resources to analyze and distribute it effectively, and most analytics packages are beyond their budgets.

Enter Fourth, the world’s leading provider of cloud-based cost control solutions to the hospitality industry. The company saw an opportunity to offer an analytics platform alongside its existing purchase-to-pay, inventory management, and other solutions.

“Our mission was to create a solution that everyone could afford and could take advantage of,” says Mike Shipley, who coordinated Fourth’s project with GoodData. “We wanted to empower our customers to run their businesses better — to make the information easy to access, and to offer it in a graphic, interactive interface that they can use everywhere.” (Read the full case study here.)

In 2013, Fourth and GoodData partnered to design and launch a data product that was unlike anything else in the hospitality marketplace. The new platform allows customers not only to improve business performance by enabling decisions based on actual data, but also to automate distribution of the right data to the right stakeholders in an easy-to-read format.

One Fourth Analytics customer, GLH Hotels, was able to identify an opportunity to increase its margins simply by changing its staffing profile. “Now we have a greater percentage of employees on part time and flexible contracts, with the obvious effects on the bottom line,” says GLH’s HR Programme Manager Andrew Elvin.

Since Fourth Analytics went live, the product has delivered — and continues to deliver — a host of benefits to the organization, including:

  • New revenue opportunities
  • Added value to current customers
  • Faster deployments
  • Thought leadership
  • ROI of 117 percent

To learn more, read How Fourth Added Analytics to Its Menu…and Realized a 117% ROI.

Categories: Companies

Turning Data Into Revenue-Generating Products: Four Steps to Success

Good Data - 4 hours 17 min ago

Over the next few years, the market for data-driven products and services is expected to grow by 26.4 percent — six times the rate of the tech sector as a whole — according to IDC Research.

Driving this growth is the realization that data products can create not only a new source of revenue, but also a unique competitive advantage. However, many companies become stuck on the “how” — how to create the product and launch with an effective go-to-market strategy.

This “how” is the subject of an article I recently contributed to Data Informed, “4 Steps to Designing and Launching an Effective Data Product.” If its data product is to succeed, the organization must:

  1. Determine goals and develop a user-led product design
  2. Establish pricing and packaging
  3. Roll out a successful launch
  4. Continue to add new features and functions

For a detailed description of each step — and a real world case study — I invite you to read the full article here.

To learn more about choosing the right embedded analytics solution for your organization, download the report Which Embedded Analytics Product is Right for You?

Categories: Companies

How The Election Shined a Spotlight on the Limitations of Predictive Analytics Technology

Good Data - 4 hours 17 min ago

The results of this week’s presidential election came as a shock to many people. But as I sat glued to CNN, watching as more and more states flipped from Blue to Red, all I could think of was how this election stands as a stark reminder of the risks that come with predictive analysis technology.

Politics, just like every other industry, has become enamored with the power of big data. And as political analysts crunched the numbers and ran predictive data models over the past months, a Hillary Clinton victory seemed almost assured, with the major vote forecasters placing her chances of winning between 70 and 99 percent. Of course, we all now know how wrong those predictions were.

The promise of predictive analytics is an intoxicating one, and it’s easy to see why. Traditional analytics technology is primarily retrospective, but radical advances in AI and computing power have promised to shift this paradigm to a forward-looking one. Using machine learning and predictive analysis, it’s now possible to come up with reasonably accurate predictions about what will happen in the future. But this new wave of technology comes with heightened stakes and greater risks, and the near-total failure by political experts to foresee Donald Trump’s victory perfectly illustrates the pitfalls of flawed assumptions, wide margins of error, and lack of context when it comes to predictive analysis.
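To see why a 70 percent forecast is not a guarantee, it helps to simulate one. The numbers below are illustrative only; the sketch simply counts how often the ‘unlikely’ outcome occurs when an event genuinely has a 70 percent probability.

    # Illustration: even a perfectly calibrated 70% forecast sees the
    # other outcome roughly 3 times in 10. The probability is made up.
    import random

    random.seed(0)

    FORECAST_PROB = 0.70   # hypothetical predicted probability of outcome A
    TRIALS = 100_000

    upsets = sum(random.random() >= FORECAST_PROB for _ in range(TRIALS))
    print(f"Outcome B occurred in {upsets / TRIALS:.1%} of simulated trials")
    # prints roughly 30.0%

A forecast like this can be entirely correct and still be followed by the less likely outcome; the failure lies in reading a probability as a certainty.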

The ability to use technology to gaze into huge amounts of data to discern upcoming trends and inform future actions can be a massive advantage, and I do believe that AI will transform the business landscape especially as far as business intelligence and predictive analytics are concerned. But this technology is young and still developing, and predictive algorithms are not magic. They rely on data models that can be based on flawed assumptions, and if those models are wrong the results can be disastrous. Just look at the event known as Knightmare, where the Knight Capital Group lost $440 million in 30 minutes when their trading algorithm went haywire.

My point is that predictive analytics, while potentially game-changing, have limitations. And as professionals from every industry become ever more reliant on these tools to make increasingly important decisions, it’s more essential than ever to appreciate and account for these limitations. Predictive technology gives probabilities, not guarantees. That is why there will always be a need for experienced, insightful experts to plug the gaps left by this type of technology. Only by applying years of experience and expertise can one take those probabilities and apply the intuition, creative thinking and context needed to make truly informed decisions. As Steve Lohr wrote in a recent New York Times article, “data is the fuel, and algorithms borrowed from the tool kit of artificial intelligence, notably machine learning, are the engine.”

But we still need to be in the driver’s seat, remembering to never take our eyes off the road.

Categories: Companies

Thoughts From the Fast Casual Summit: Data, Analytics and Innovation in the Restaurant Industry

Good Data - 4 hours 17 min ago

Last month I had the opportunity to attend the Fast Casual Executive Summit in Orange County, where I led an interactive “Brain Exchange” session about “Analytics as Assets.” This discussion was all about how QSRs are leveraging analytics to increase their trust with operators and automate key workflows, allowing them to dramatically improve margins. After the event, I sat down with Shelly Whitehead from Fast Casual for a Q&A session about some of the insights that were shared. Here are a few of the highlights:

  • Analytics aren’t just beneficial for large brands. With respect to sharing data and analytics, some of the participants from larger organizations were surprised to learn that the smaller brands were just as passionate about and dedicated to innovation as they were.
  • Correlation is key. Simply measuring and communicating metrics is not enough; there is an urgent need to understand why metrics move. By understanding cause and effect, restaurateurs can take better-informed actions and create opportunities to innovate (e.g., if restaurateurs understand how a social media marketing campaign impacts demand and traffic, they can leverage data to drive smart menus and automate scheduling).
  • Think about your servers as end users. Most of the time, we think of the end user in terms of management, suppliers, partners, and those in HQ. But our discussion revealed lots of opportunities to deliver information to servers to help them optimize and maximize their individual tickets, which not only helps them individually but also drives success for the restaurant itself.

These are just a few of the insights from our full discussion. Restaurants today face serious challenges, both internally in the form of large increases in operating costs and externally from emerging business models at companies like Blue Apron, Munchery, and others. However, there are also massive opportunities for innovation in the restaurant sector, and the leaders I spoke with all commented on the value, business impact, and benefits that distributed analytics can have on their organizations.

To learn more about the possibilities of distributed analytics for restaurants, download How Big Data is Revolutionizing the Restaurant Industry.

Categories: Companies

80% of Companies Will Fail to Monetize IoT Data, According to Gartner

Good Data - 4 hours 17 min ago

It seems that lately I can’t log in to Twitter or open LinkedIn without being bombarded by articles about the coming wave of transformative business opportunities that will be created by the Internet of Things. They herald IoT as the new frontier of economic growth, with Cisco estimating that connected products have the potential to generate more than $19 trillion of value in the coming years. But even as the promise of IoT has ensnared the minds of the business community, many have ignored or downplayed the massive elephant hiding behind the mountain of gold: actually monetizing IoT is going to be hard. So hard, in fact, that Gartner estimates that through 2018, 80% of IoT implementations will squander their transformational opportunities.

I’ll give you an example: telematics have the potential to completely transform the insurance industry. But right now, most companies are focused on the data plumbing, rather than putting that data to work making business impacts. In short, the vast majority of the data being generated by connected devices remains undervalued and unmonetized.

There are exceptions. Automakers are leading the way in consumer-facing IoT monetization, and in the B2B space Nest is a powerhouse with its Learning Thermostat product, which enables it to offer energy management services to utilities. But as Gartner notes, the trend toward treating and deploying data as a monetizable asset is still in the “early adoption” phase.

I believe this shortfall is largely attributable to the fact that very few business leaders have laid out concrete plans to turn the data being generated by our “things” into actual revenue. Capgemini Consulting notes that although 96% of senior business leaders have said their companies would be leveraging IoT in some way within the next three years, over 70% of organizations do not generate service revenues from their IoT solutions.

This is a huge issue: to remain competitive and thrive in the digitally connected economy, enterprises will need to find ways to monetize IoT data and harness it to enhance performance and create new business value. The problem is that most organizations lack the knowledge and experience needed to capitalize on their information assets. But as Albert Einstein famously said, “In the middle of difficulty lies opportunity,” and monetizing IoT data can be a major competitive advantage for the truly innovative organizations that do it successfully.

A key piece of solving the IoT data monetization puzzle will be choosing the right technologies and partners to help organizations achieve their goals. The flood of data from the IoT will be wasted unless organizations can analyze and package it in a valuable way, and Gartner states that most opportunities for monetizing IoT data will involve some level of refinement and analytics. I believe that embedded Smart Business Applications that leverage artificial intelligence will be an integral part of the solution, as they provide the vital layer of “connective tissue” that can turn the raw data generated by IoT devices into actionable insights that drive value.

AI will transform the business side of IoT, with applications for every industry from automotive to healthcare and insurance. IBM Watson, Google Now, Alexa, Siri and other platforms have opened people’s eyes to the possibilities of AI, and business users want that technology applied to the tools that make them better at their jobs. Truly “smart” business applications will harness the power of AI by taking data from a myriad of connected sources and applying predictive and statistical analysis to that data to ultimately predict outcomes and deliver powerful insights that recommend and inform actions to business users, unlocking monetization possibilities.

Categories: Companies

'Looking for Leia' film shines light on Star Wars fangirls - CNET

Annalise Ophelian talks about her crowdfunded documentary, which asks women inspired by Star Wars about their love for a galaxy far, far away.
Categories: Blogs

Tech investor accused of sexual harassment takes leave - CNET

The Wisdom of Clouds - James Urquhart - Sat, 06/24/2017 - 01:16
Justin Caldbeck, co-founder of Silicon Valley VC firm Binary Capital, apologizes to those he hurt and "to the greater tech ecosystem."
Categories: Blogs

Is Uber 2.0 now possible after CEO’s departure? - CNET

The Wisdom of Clouds - James Urquhart - Sat, 06/24/2017 - 00:32
Uber may get a fresh start after Travis Kalanick's resignation this week. But any new CEO needs to own up to the company's past -- and watch out for the old one's continued involvement.
Categories: Blogs

20 of the best 'Game of Thrones' fan theories - CNET

The Wisdom of Clouds - James Urquhart - Fri, 06/23/2017 - 23:46
Who's going to kill whom? Who's already dead? And who will ride the dragons?
Categories: Blogs

Instagram 'Favorites' list lets you control who sees your pics - CNET

The Wisdom of Clouds - James Urquhart - Fri, 06/23/2017 - 23:23
Not everything on Instagram is worth sharing to the world.
Categories: Blogs

Tweeters riled by CEOs 'experiencing' homelessness via VR - CNET

The Wisdom of Clouds - James Urquhart - Fri, 06/23/2017 - 23:01
Commentary: A video of business people donning VR headsets posted to Twitter by a homelessness charity in Australia makes many think of a painful dystopian future.
Categories: Blogs

Russians make a fidget spinner out of... cars? - CNET

The Wisdom of Clouds - James Urquhart - Fri, 06/23/2017 - 21:54
This trend has officially gone too far.
Categories: Blogs

New Apple Store gets a MacBook roof - CNET

The Wisdom of Clouds - James Urquhart - Fri, 06/23/2017 - 21:31
The roof of Chicago’s new Apple Store has been revealed. The real question is: Is it an Air or a Pro?
Categories: Blogs

Samsung Galaxy Note 8 said to cost $900, launch in September - CNET

The Wisdom of Clouds - James Urquhart - Fri, 06/23/2017 - 21:20
If rumors are true, the Note 8 will be Samsung's most expensive Galaxy phone to date.
Categories: Blogs

The 3:59 extended edition: Are you an iPad Pro or Surface Pro? - CNET

The Wisdom of Clouds - James Urquhart - Fri, 06/23/2017 - 19:48
Here's the full podcast and YouTube post-show from one of our favorite episodes this week.
Categories: Blogs