Podcast

Tapping the Enormous Value of Historical Data

InsideAnalysis Podcast


Speakers

Eric Kavanagh

CEO @ The Bloor Group


Joe Gaska

CEO @ GRAX


About this talk

Those who don't study history are doomed to repeat it. That's especially true in the digital age: historical data piles up everywhere, and yet few companies are positioned to use it. The ideal solution today should enable rapid access to disparate SaaS application data so that businesses can get a strategic view of what's really happening. That strategic view is almost always a function of understanding the trajectory of historical data.

Watch this episode of InsideAnalysis to hear host Eric Kavanagh explain why historical data holds the key to digital transformation. He's joined by Joe Gaska of GRAX, who will share his perspective on the critical role that historical data must play in optimizing business outcomes. They'll also discuss the reality on the ground for today's innovators, and why SaaS data holds the key to the next generation of cloud-enabled business models.

56 min.

Transcript

[MUSIC PLAYING] [MUSIC-BLACK BANANAS, "RAD TIMES"]

VOICEOVER: The information economy has arrived. The world is teeming with innovation, as new business models reinvent every industry. Inside Analysis is your source of information and insight about how to make the most of this exciting new era. Learn more at insideanalysis.com.

And now, here's your host, Eric Kavanagh.

If you're so down, you'll go bad in your mind

Welcome to the future times

It's a positive explosion, so slow the motion

We have all been chosen to be here

ERIC KAVANAGH: All right, ladies and gentlemen. Ladies and gentlemen, hello and welcome back to the storm-plagued radio show that is broadcasting coast to coast here on Inside Analysis. We just had a big ol' storm come through, folks, so that's why we're sounding a little bit funny right now. But hopefully Joe Gaska can hear me over at GRAX, because we're going to talk a bit about storage today, and about historical data in particular. I want to bring Joe in to give us an intro to what he's doing while I try to get back downstairs to my studio on auxiliary internet connectivity. Joe, tell us a bit about historical data and what you're working on these days at GRAX.

JOE GASKA: Great, thanks, Eric. We'll try to entertain the crew while you deal with the technical difficulties on your dial-up internet. So, let's get in and talk about this.

ERIC KAVANAGH: Yeah, but I need to know. I need all of it.

- So, as Eric deals with these technical things, let's jump in and talk about historical data and historical significance, and how GRAX got here and why. Over the last six years, we as technologists have really seen prices get compressed and computing power grow under Moore's law. When we started to rethink data, and the amount of information we can gather when we're not limited by compute or storage, we started to rethink how we would look at data. How would we look at changes over time? What do we want to capture, and how do we want to capture it?

Now, when we started looking, my last company, which was eventually sold to Google from LogMeIn, was in IoT, dealing with the historical velocity of changes: customer acquisition, customer retention. All of the really interesting data points were about historical data itself. And we really started to rethink: OK, what data is important for businesses? How do we retain it without getting lost in data warehousing technologies, or Kafka, or the many different historical time-series technologies?

Really think about how we capture that history, so that in the future, when we want to analyze those changes or behavior over time, it's possible, even plausible. Understanding how and why businesses change really comes down to looking at historical data itself. And it's no different from your personal life, your family history, or your ancestry: if you really want to understand change or behavior, you review and understand historical data.

And one of the biggest things we noticed is that a lot of the SaaS applications in the world today are very point-in-time-centric. Not only Salesforce: if you look at ServiceNow, if you look at all the major SaaS vendors out there today, they're very much point-in-time-centric. So understanding and capturing that history, and more importantly owning that history, was crucial for many of the companies we're working with today.

So, a lot of these cloud vendors: you know, when people say the data is in the cloud, as everyone today knows, data in the cloud is really just locked away on other people's servers, right? So how do we free that information and make sure it's available for historical review, which is crucial for a lot of our customers? When we really started to rethink that, we said, well, I don't want to capture just some of my history, I want to capture it all.

And there are a lot of technologies people are starting to analyze with today, whether it's machine learning or artificial intelligence, and all of those are based on historical data. So again, really start thinking about history and seeing what else in your business requires historical data. You have backup, you have archive, you have compliance, you have auditing; all of these are based on historical data.

Now, move from a tactical need to a strategic one. We have tactical obligations; all of our business functions today require historical data: disaster recovery, point-in-time restore, compliance, auditing. So rethink that and say, well, if we capture all of our history, what other very interesting things can we do with it in the future? Whether it's technologies like SageMaker, or artificial intelligence, or machine learning, all of those are founded on large historical training sets.

So, that's one of the things we really wanted to look at, and I would love to hear what others are doing here as well, especially in the SaaS world. You can feel free to ask questions as we go, especially while Eric is having more technical difficulties; post messages in the chat window and I'll be happy to jump in on any of them. This was even before we started thinking about building one piece of code for what GRAX is or would become: how can we make sure all of our customers retain every version of their history, and retain ownership of that historical data, forever?

So if they want to do analytics, machine learning, or artificial intelligence, the historical data essentially becomes gold. Back in the early days of signal variance, if you had a large enough training set, signal variance could help you detect anomalies, and detecting anomalies can give you basic positive or negative indicators for your business. All of those are based on history itself.

So, one of the biggest things we really wanted to look at is, how do we unlock that data from the major SaaS vendors, capture every version at the highest fidelity, and lock it away for customers in their storage of choice, whether that's customer-owned S3, or Blob Storage, or Big Objects? Really get that data out, so you have optionality in the future.

Now again, for learning from the history and analyzing the history, there are many different technologies, whether it's Snowflake, Redshift, QuickSight, or Power BI; a lot of these tools are now capable of learning from history. The most crucial thing is taking ownership of that history for your company and retaining it: taking it off the SaaS vendor's cloud and moving it into a cloud that you own and retain forever.

Customers should own their history. History is a source of extreme knowledge, and whether you choose to use it in artificial intelligence now or in five or ten years, having the optionality to act on that history, whether through analytics, machine learning, or artificial intelligence, depends on taking ownership of it. We've been fighting for companies, and with cloud vendors, to make sure you can capture all of that data. We've now crossed the point where, with Moore's law driving down costs, storage is negligible for the amount of data involved.

When you start thinking in terabytes or petabytes of data, it is now plausible for you to store all of it within your own cloud. So that's one of the biggest things we've pushed for: getting people to understand the significance of the historical data itself, not so much the technology. That's what GRAX and I have been hyper-focused on. Darian, were you--?

DARIAN SIMS: Yeah, I'm not sure if you can hear me. I dialed in finally; we're still dealing with a power outage here. But I heard the last part of what you mentioned, Joe, and I want to talk about cloud-native for a minute, because that's the other interesting dynamic: you've focused on being able to channel historical data into your customer's cloud of choice. So instead of, as you suggest, holding it hostage, you're going to put it where they want it, and I think that gets them one step closer to leveraging a cloud-native architecture. Right?

Whether it's Amazon or Google or Microsoft or whatever, if you have your data in the right place, it can be even more accessible to all these other apps, right?

- 100%. The single largest movement we've seen across a lot of our companies is people migrating to the cloud, and not only migrating to the cloud but becoming multi-cloud companies. A lot of our customers don't just use AWS; they actually use parts of Google's technology or parts of Azure's technology. So there's a multi-cloud experience where our customers want to bring their own storage, and they want to choose where they're going to store the data.

Now, one crucial thing, and this is in the weeds a bit: when you start capturing every version of data and you're dealing with terabytes or petabytes, moving that data through APIs breaks down, and egress costs become incredibly expensive. Having that data as close to consumption as possible is actually critical. So to your point, we want to store it as close as possible to where the customer is going to consume it, whether that's analytics or artificial intelligence. That's why we've been trying to convince people: you need to capture the history and store it within your own cloud, because then consuming it becomes incredibly easy.

And that's really where it becomes incredibly valuable to make sure you store it in AWS or Azure or GCP, it's your choice.

ERIC KAVANAGH: Yeah, that's a really good and very important aspect here. And the other interesting thing too, and you were kind of talking about this, I think at least you alluded to it, is the format in which you store the data. You keep it in as pure a form as possible. And the reason for that is because that way when, let's say there is some incident like, I don't know, a power outage in Pittsburgh, and there's some downtime and you have to go back and capture transactions that were lost when the power went out.

Well, if you've stored that data in its pure format, it's much faster to get it loaded and get running again, right? And time is money. So you'll save time, you'll save money, and you'll get right back to doing what you were doing, right Joe?

- 100%. So, you want to store it in its absolute purest form. You don't want to transform it. The reason we choose to store it in the lowest primitives within these clouds, as close to bare metal as possible, is one, cost efficiency, and two, we can keep it in its purest form. That means we're not reshaping the data, so you don't lose any of its fidelity. When you restore, you have every piece of data, plus the metadata, all of it.
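To make that concrete, here is a minimal sketch of the kind of append-only, raw-format capture Joe describes, assuming a customer-owned S3 bucket. The bucket name, key layout, and record fields are illustrative assumptions, not GRAX's actual implementation.

```python
import json
from datetime import datetime, timezone

import boto3  # AWS SDK; assumes credentials for the customer-owned account are configured

s3 = boto3.client("s3")
BUCKET = "customer-owned-history"  # hypothetical bucket name

def capture_version(object_name: str, record: dict) -> str:
    """Write one record version as an immutable, timestamped JSON object.

    Nothing is transformed or overwritten: every capture lands under a new key,
    so the full fidelity of the source record is preserved for later restore
    or analysis.
    """
    captured_at = datetime.now(timezone.utc).isoformat()
    key = f"{object_name}/{record['Id']}/{captured_at}.json"
    s3.put_object(
        Bucket=BUCKET,
        Key=key,
        Body=json.dumps(record).encode("utf-8"),
        ContentType="application/json",
    )
    return key

# Example: capture a Salesforce-style Account record exactly as it looks right now.
capture_version("Account", {"Id": "001xx0000001", "Name": "Acme", "Stage": "Renewal"})
```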

So, one person had a comment here; I'm not sure, Eric, if you've seen it: if you have your own cloud, what is your recommendation to protect it? One critical thing there, and we've spoken to a lot of different companies about this: the first part of protecting your own cloud is tightly controlled access, and the business processes that surround it. The vast majority of issues we've seen are governance issues more than anything, along with access control to the cloud itself. So make sure your governance and access controls are set up right from the start.

Once you start moving and you let people in, especially early on, there are a lot of errors that can be made. If I had to name the biggest thing, it's governance, access controls, and policies, and setting up all the auditing and security right up front. That's the number one piece of dealing with your own cloud. So--

ERIC KAVANAGH: Yeah, that makes a lot of sense, and obviously what we're dealing with here is lots and lots of data. One of the topics I wanted to get into with you today is Salesforce, which of course is a juggernaut. I mean, it's arguably the poster child for software as a service in the business world, in the sort of post-ASP, new SaaS world. But platforms like Salesforce evolve over time; they take on lots of different shapes and bring in different modules for the different things they want to do.

But the point is, historically, they're not really set up for data egress, and that's why the approach you've taken is very clever: siphoning off that Salesforce data at whatever frequency the client wants, so you can basically liberate it for other applications. Because just pulling it all out of what's in there is a bit of a messy process; I've tried to do it myself, and it's not very easy to do. So what you're doing is greasing the tracks to future data-driven innovation. Is that about right?

- 100%. And so, as all these ecosystems expand, they want to do more and more to keep you in an almost locked-down ecosystem. You're seeing it more and more across the enterprise. We saw it early on with Oracle, when they had their suite of tools. When you see these other clouds expand, for example Salesforce purchasing Tableau and purchasing Slack, the center of gravity of that data becomes, like you said, a juggernaut for the business itself.

But having that historical data outside of Salesforce, stored within your cloud of choice, whether it's your Azure, your AWS, or your GCP, gives you optionality over how and where you want to use it in the future. Now, one of the biggest things we've discussed before is that you shouldn't be leased access to your own historical data. You shouldn't be renting your history, renting your knowledge. That's the key piece. So really capturing all of that historical data has been absolutely critical in what we're doing.

So, I think Eric is still having a little more technical difficulties.

ERIC KAVANAGH: Yeah, I'm still here, I'm all right. I'm going to keep the video off just because bandwidth is low. I've got my portable MiFi device here, which I keep just in case a storm comes through. We've got redundancy in our architecture here, folks. But we're talking to Joe Gaska, of a company called GRAX-- grax.com. Look those guys up. We will be diving into some of the other details about architecture, and really how you can use so-called historical data. That's what really gets me excited.

And when I first started talking to these guys, the wheels started turning. I'm thinking to myself, as a digital marketer, there is so much information I could track on the history of prospects, of audience members, of clients, of vendors, of the market; there are all kinds of things that, once you start tracking, you can start analyzing and understanding. We'll pick that up after the break. Folks, don't touch that dial, you're listening to Inside Analysis.

[MUSIC-BLACK BANANAS, "RAD TIMES"]

VOICEOVER: Welcome back to Inside Analysis. Here's your host, Eric Kavanagh.

Take us to the future

ERIC KAVANAGH: All right, ladies and gentlemen, back here on Inside Analysis. "Take us to the future" is right, but if you go into the future not knowing your past, you won't really know where you're going. There's a famous old quote: "If you don't know where you're going, any road will get you there." Well, that's fine if you're retired, or independently wealthy, or decidedly meandering through, let's say, the streets of Venice in the 1980s like I did a long time ago. But yeah, if you're in business, you want to know where you've been to understand your trajectory.

We're talking about that today with Joe Gaska, the founder of a company called GRAX; look them up online at grax.com. We were just opining during the break about the history of electricity, back when people were off the grid, and I think we're going back to being off the grid. There's an interesting analogy we can draw to this topic of data and owning your own historical data: in a sense, being off the grid.

So, when you're on the grid, that means you have access to what you're being given, like electricity. But if the electricity goes down, then you're SOL, as they say, unless you're off the grid or at least prepared for it. It's kind of like apps going from online-only to offline-capable, right? That was a big trend about seven or eight years ago: apps really trying to figure out, OK, how do we solve the problem of someone going offline, because you want to be able to keep working in your Google Docs, for example. And all these guys are working on that now.

But we have the same concept in data, right? If you have to pay to get access to your own data, that's going to be an impediment; it's going to slow things down. You were talking before about API calls, Joe. Again, these are some of the hidden costs of doing business in the cloud, right? Egress costs for getting your data out, which we've talked about before, and API hurdles, where you don't know when you'll hit the limits. And when you do, it's a problem.

So, the key is that you want to be as insulated as you can, meaning you have access to your data, right? You don't want to have to pay for access to your own data. That's the original thought you guys had. And now, coming down to the modern day, where we are today with History Stream that you've rolled out, the key is to be able to look at your history and understand it. I've mentioned this to you before, because I've used email marketing tools like iContact, Constant Contact, and others: if you as a marketer can stitch together the different pieces, the different touchpoints with a client, with a customer, it can really help you understand what they want, where they're going, and what's important to them.

And that's what you're enabling now with this History Stream concept, right? You're enabling historical data to be used as an asset, to plot the lines and understand where customers have been, and then be able to predict where they're going to go. Is that about right, Joe?

- 100%. Now, just to finish one other thought and bring it back to exactly where you were going with the off-the-grid discussion. Electricity was invented around 1879, and it wasn't prevalent in every household until about the 1920s. Now, think of that in relation to the birth of the internet in the 1980s, relative to where we are today, and think about how much of a disruption electricity was.

When you think of that relative to the internet, the different economies, and the different pieces that are going to be unlocked, we're still just at the infancy. Even though computing has been increasing exponentially, technology is still at the very beginning. So historical data, and how it's used in these technologies, even with artificial intelligence and machine learning, and taking ownership of that data, is also about protecting your optionality in the future, right?

So capture all that data, because across different technologies, whether it's analytics, reporting, or machine learning, all of that history is really the basis for training or giving knowledge to those applications. The importance of taking ownership of that data and maintaining the chain of custody under your direct ownership is one of those things I'm incredibly passionate about. People should never be leased access to their own data. When we say we want to protect our customers, I mean that wholeheartedly: customers themselves maintain that ownership forever.

And that's really to protect them now, with the tactical obligations of backup, archive, and restore, all those insurance policies that no one likes to buy, but also by having that data in a place that you own, like you mentioned before. When this data starts accumulating, and you're doing a backup once, twice, three, or five times a day, and you're capturing all the differences in the data, it starts to become a training set for learning how your data changes over time.

Trying to move that through APIs, as you were just talking about, is amazingly expensive if you have to transmit it from someone else's API just to get it into your analytics tool to do a report.

ERIC KAVANAGH: That's right.

- Right? And that's really where you might say, well Joe, I can just go to Salesforce and get my history. Well, you can't, because you have one version of the record. Once that version of the record replaces the previous version, you've lost that change. Not only that, but for compliance and regulatory reasons, all of these SaaS vendors don't want to store your history, because it increases their liability and risk.

ERIC KAVANAGH: Interesting.

JOE GASKA: Right, so--

ERIC KAVANAGH: That's a good point.

- So they actually, purposely, want to be able to say, no, we don't have any of your history, because if anything bad happens and the federal government comes in and says, hey, give us your historical data, they can say, no, sorry, we don't have it for that business. And it becomes important. So that's one of the key places where, obviously, we're very, very passionate about this: historical data is a wealth of knowledge. And the biggest thing we've seen in talking to people is that it's not that difficult to conceptualize.

Everybody today, in every business, uses historical data to answer questions. What did my business look like last month compared to this month? What did my case volume look like last week compared to this week? What does my pipeline trend look like? What does my pipeline velocity look like? All of those questions are answered through historical data, or people want them to be, but they're very difficult. With all of that data in one place, answering those questions is just a matter of choosing the tool you want to answer them with.

Whether it's QuickSight, Snowflake, Tableau, or Power BI, once you have the data, it's really just choosing the facade you want to answer or facilitate that question through. And that's one of the biggest things: not just protecting yourself from disaster, but also protecting your optionality in the future. If you want to load that data into a data warehouse, and today you choose Snowflake and tomorrow you choose Redshift, that's a business decision. All of that data and all of that historical value should be retained by the businesses themselves. You're back.

ERIC KAVANAGH: Yeah, yeah, no, that's a good point. And I'd like to also point out that when it comes to analysis, it has to happen quickly, right? If there are long gaps in the amount of time it takes you to ask questions and get answers, you're just going to lose interest and do something else. And I think this is what a lot of people don't recognize. In fact, when I was talking to that reporter from TechTarget just last week about what you guys are doing, he was asking me, well, what's happening today that is not in line with what people want?

In other words, what is GRAX disrupting with this technology? And I said, well, you can still do the kinds of things that GRAX is enabling the old-fashioned way, it just takes a lot more time and a lot more effort. You wind up with data engineers that spend all day pulling data out of this system, transforming it, and pushing it into that system. It's a time-consuming process. It's error-prone. And it's also expensive: expensive in terms of egress costs, to your point, and expensive in terms of human resources.

But I think the most important thing is, it's expensive in terms of boredom. You're really driving your people into the ground if you force them to manually put rivets in cars all day the way we used to. Well, machines now build cars because, guess what, they're less error-prone and it's less dangerous; you don't have human beings getting in between these big moving parts. And it's also more efficient. As a general rule in the world of business and technology, if a machine can do a job more efficiently than a human, it's time to hand that job off to the machine, as long as it's cost-effective and efficient and ethical, and so on and so forth.

And that's what we're seeing here. So, you could have done all this sort of historical analysis before, but it would take you so much time to pull data from all these different systems, collate it, and get it correct. As you said, we used to put all this in data warehouses, and we still do. Snowflake is obviously a data warehouse in the cloud, but they thought through some of the core foundational components, and what they did was separate compute from storage. What you're talking about is kind of separating storage from the prison of someone else's cloud, right?

So, it's this abstraction that frees you to do new things. And that's what you guys focused on: abstracting out the historical data and allowing the end user to put it in their own cloud, so they can access it whenever they want. And what excites me is thinking about the next year or two, when you'll be able to provision systems like email marketing and social media monitoring. All of these kinds of things you could do now, but it's going to be a lot easier. It's all about greasing the tracks and getting people to the analytical process sooner, so they make business decisions that help the organization, right Joe?

- Yeah. You couldn't have hit the nail on the head more. We want businesses focusing on answering business questions, not the plumbing.

ERIC KAVANAGH: Right.

- It should be as simple as a couple of clicks to say, I want all of my historical data in my tool of choice. Tool of choice being, OK, if you want to use Redshift or Athena or QuickSight or Power BI, you should be able to simply say, I want all of my historical data there, and have it continually streaming into that application and updating. One thing you said is crucial: this data needs to be continually updated, as fast as possible, with the highest fidelity and the most data points, because people want to look at the data differently.

What you don't want is for every company to have to employ plumbers. You don't want everybody hooking this up with big, ETL-heavy processes. There are a lot of great standards out there, and a lot of great technologies, that make it a simple point-and-click and the data is consumed. And that's one of the biggest things we wanted to do: in a matter of minutes, you focus on business questions, not the plumbing. There's no coding in anything we're talking about.

With a few clicks, all of this historical data can be ready to answer a business question using a visualization tool like QuickSight, or Power BI, or using SageMaker. Focus your attention, and really your resourcing, on the business questions that are crucial for your work, not the plumbing. That was one of my biggest things: we were seeing millions of dollars spent on people grinding it out, moving data from one place to another and making sure it was reliably monitored and controlled.

That's really where we're trying to focus, on alleviating that pain. So it's just, hey, I want to answer this question using historical data. It should be simple and easy.

ERIC KAVANAGH: And you made a really good point too about when an update occurs in Salesforce, that record is now updated. And unless you have a copy of the historical data, you won't know that it was different. You can't go into a time machine inside of that environment, at least not yet, to access that. I mean, some of this is changing with these append-only databases, but the old model of doing things was to overwrite stuff.

So, one of my favorite tools is the slider bar, and if you've never used one, it's a lot of fun. You can probably go online and do some searches for looking at data with a history slider bar. You move the bar left to right, and let's say it's the dimension of time, and you want to watch different housing projects over the past 20 or 30 years and how much money went into them. It's extremely revealing to watch how things develop over time.

In fact, you can't even understand any sort of significant business concept if you abstract out the time component because time changes everything. When upgrades to software come out, that changes how your computer works. I mean, on this call I was trying to do something on my Mac, and I had upgraded to the new version of the operating system, but of course, they changed how I view things now. So, I'm like, OK, I'll have to learn how that changed.

That's the one thing with the cloud that's a little bit frustrating: it just changes, and then you have to go along with the flow and figure out what changed. But to get back to the main point here, the key is this historical data. So, what was the activity on the Acme account, for example? Who first got in touch with these customers, or with this prospect? Who got into the mix?

And what Joe is describing here, folks, is that if you have access to your own historical data from a tool like Salesforce, you can analyze it in some other tool, because Salesforce wasn't really designed to analyze data in that fashion. It was designed as an operational system to help your salespeople stay on track: who's doing what, whom to call next, all these kinds of things.

But from a business perspective, you might want to take a step back and think, well, hold on, who was involved in the Acme account in the early days? Because things were going really well then, and something happened six months ago. Let's log in and see: oh, this other guy joined the company, he was the touchpoint. OK, let's have a conversation with Bob, because Bob apparently doesn't know certain aspects of this account. The point is, if you have that historical data, you can go back in time and watch how things changed.

That can make a big difference. It can make the difference between losing an account and keeping an account for another year. Maybe it's a $50,000 account, not a small thing, folks. This is why historical data matters. But don't touch that dial, I'll be right back. You're listening to the only coast-to-coast show about the information economy. It's called Inside Analysis.

[MUSIC-BLACK BANANAS, "RAD TIMES"]

[MUSIC-MICHAEL KIWANUKA, "ROLLING"]

[MUSIC- BLACK BANANAS, "RAD TIMES"]

VOICEOVER: Welcome back to Inside Analysis. Here's your host, Eric Kavanagh.

ERIC KAVANAGH: All right, ladies and gentlemen. Back here on Inside Analysis, talking with Joe Gaska of GRAX. We've got some big questions coming in from the audience. And folks, if you're driving down the highway listening to the show and you're like, wait, how do I get backstage for Inside Analysis? Send me an email: info@insideanalysis.com. Just register for one of our webinars. We use Zoom to record and broadcast these shows.

And of course, we archive all these shows as well. That's our historical data: all the shows we've done in the past. We had a couple of good questions come in, one around governance and one around metadata, and how metadata is essential. Yes, it is. Metadata is the glue that holds information systems together. And just to get back to the topic du jour with Joe, what's really important here about your historical data is understanding your customers, understanding your trajectory, understanding your sales history, for example.

Because a lot of times we forget to look in the rearview mirror and really understand what happened. And if you have access to the data, and it's easy access, then you can map it into some analytical tool like Joe's been talking about and look for the patterns in the data. And you'll see these seasonal patterns that'll show up depending upon your business, maybe it's monthly, maybe it's weekly, maybe it's annual. But you're going to find something in there if you have easy access to that data.

But if you don't, it's going to be trouble. As regards governance, one of the nice things about the cloud and metadata is that the cloud just kind of captures all this metadata dynamically. It was built to capture the metadata. Who logged in at what point? Which system did they touch? Which data did they touch?

That's something called DataOps; we talk about that a lot on the show as well. And then, of course, the metadata, like we say, is the glue that holds it all together. But you can't do a whole lot with that if you don't have access to your own data, and that's what gets me excited. But what are your thoughts, Joe, about governance, baking good governance in, using catalogs, and metadata from your perspective?

- Sure, there are some great questions out there. These are things I couldn't be more passionate about. As I was saying before about not having to focus on the plumbing and focusing on answering the business question: typical business users aren't going to get into the data catalog, form a question, and then figure out how to aggregate and correlate all of this data to provide an answer to that question.

Having enough resourcing, enough business analysts inside who can help build the dashboards or customize a dashboard for the business users, so that the question gets answered earlier and quicker, that is critical. We are getting closer to the point where these analytical and visualization tools are becoming very easy. But when you start getting into historical data and aggregations, and making sure things aren't double-counted, having the right business analyst or business support to help answer, facilitate, and bless the reports is critical.

Because the one thing you don't want is for people to get false positives from the data, or they lose trust. So making sure you have support to deliver the answers while you focus on the questions is critical for the business itself, right? One of the questions here is about, for example, year-to-date numbers. Now, with year-to-date numbers, think of velocity, pipeline velocity, or pipeline stages for sales. You want to make sure you don't double-count any of those historical numbers.

So, how do you aggregate, and how do you sum up the numbers? Making sure you're not overshooting the numbers is critical. So for all of the reports and dashboards, one, make them very easy to consume and get. What does that mean? Tools like QuickSight or Power BI or Tableau: your business users want to consume it in their tool of choice, so it becomes critical to bring it to where they are. But you still can't lose the business support; you have to have them. You have to have a way to get support to build reports and dashboards, and that's absolutely critical.

Someone commented that good metadata is essential. That is such a critical piece. These databases, whether they're in the cloud or not, change over time. So knowing whether the structure of your database is changing, whether a field's definition has changed from one month to the next, is absolutely critical, and all of that has to be understood by the business users. GRAX captures every version of the metadata and every version of the data in your instance. Think of every piece of data that you want being pulled out and put into your cloud of choice.

One person asked, how do you back up more of the information, and how often would you need to update the historical data? Awesome question. This not only changes from object to object, it can actually change from field to field. For example, let's take the highest-expectation case. If we're a medical device manufacturer and we're putting all of our data into, let's call it a case object, where someone is calling up for help and you're giving them a recommendation, that company actually has to store every version of a case. So you're not capturing just a once-a-day backup; you're capturing every single version of that case, so that when you're audited, you have the information you need to provide.

So now, and not only for regulatory compliance, the need to capture every version of that data and back it up to your historical repository is really a dial setting per object. If you really want to capture every version of the data for compliance and regulatory needs, you can do it. We have some customers that want to capture every version of the data so they can understand how it's changing over time. For example, if someone modifies a sales order quantity, they're updating their manufacturing account, so they want every version of that to do just-in-time reporting.
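To illustrate that per-object dial setting, here is a minimal, hypothetical sketch of what such capture policies might look like in code. The object names, frequencies, and the CapturePolicy structure are illustrative assumptions, not GRAX configuration.

```python
from dataclasses import dataclass

@dataclass
class CapturePolicy:
    """How often to snapshot one object type, and at what granularity.

    'every_change' captures each version as it happens (the medical-device case
    example); a cron-style interval captures periodic snapshots instead.
    """
    object_name: str
    frequency: str               # "every_change" or a cron-like interval
    capture_all_fields: bool = True

# Illustrative dial settings per object.
POLICIES = [
    CapturePolicy("Case", frequency="every_change"),        # regulatory: every version
    CapturePolicy("SalesOrder", frequency="every_change"),  # just-in-time reporting
    CapturePolicy("Account", frequency="0 */6 * * *"),      # four snapshots a day
]

for p in POLICIES:
    print(f"{p.object_name}: capture {p.frequency}, all fields={p.capture_all_fields}")
```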

You should really think of every version of the historical data as a continuous stream going into wherever you keep it. The consumers of it, whether it's analytics, reporting, or downstream business operations, are going to have their own needs, so you want to capture at the highest fidelity, because there will be different needs downstream. That's one thing that's been critical: now that data storage costs are driving down to next to nothing, I tell people, store it at the highest frequency you can possibly think of needing.

And store every version of every field on the object itself, because in the future you don't know what your analytics and reporting needs will be. And to tell you the truth, the cost of storing it now is negligible. Now, one other question here: one user asked, are we talking about historical data just in the cloud, just in SaaS, or on-prem as well?

So, a lot of what we've hyper-focused on is getting the point-in-time information out of the SaaS data and storing it in the cloud, I mean, in the customer's cloud of choice. Whether it's internal data or cloud data, it's really the same. Whether you've moved it to the cloud or kept it on-prem, that historical data has significance, and it should be put in the place where it can be consumed. In our mind, there is no differentiation.

We've hyper-focused on these cloud vendors because, when we started doing analysis with a lot of the analysts out there, we really looked at how locked away the data was in those clouds, and we wanted to capture everything. But I agree with what everyone's saying about metadata. Metadata can mean a lot of different things to a lot of people, but we treat metadata as just data itself. So we capture every version of the metadata, whether that's a schema definition, or when you logged in, or when the records changed. All of that data is critical to us, and the more you can capture, the better.

Some people look at metadata as, hey, how often do users log in? Some people view metadata as their schema definition.

ERIC KAVANAGH: And you know what, schemas change over time, right? And that's not a small change. If your schema changed, that can change your perspective. So this is also very good, of course, for troubleshooting, for rolling back in time: wait a minute, why are my numbers different today? Well, maybe someone changed the schema on you. That's not an insignificant event.

And so again, if you can roll back, and this is the whole concept. We've got about a minute left in the live show, but if you can roll back in time to when things were working well, this is what excites me. So, real quick, Joe: with historical data you could see, for example, that we had a peak in sales last September. What was going on there? You can dive in and figure out who was involved and what happened, and then you can try to recreate that magic, right? Real quick?

- Yeah, so you can absolutely dial back the data, but you can also dial back to see how the schema has drifted. With the metadata, you can see how the schema is changing over time, alongside how the data changes. So if you created, deleted, added, or modified different fields, you can see every version relative to the data. If we have 15 versions of the data, they're all related to their metadata references, so we have the full definition of the schemas and so on. For some really complex questions, it's critical that you have that.

ERIC KAVANAGH: That's right.

- And if you want to recreate it. So, if you want to say, hey, I want to go back in time to this version of the data--

ERIC KAVANAGH: We'll pick this up with the podcast bonus segment folks. Send me an email-- info@insideanalysis.com.

OK folks, time for the podcast bonus segment. Here on Inside Analysis, talking to Joe Gaska from GRAX. We were right in the middle of a point there about schema. And of course, the schema is like the model that you build to understand your data. And schemas will change over time. Maybe someone adds in a new field, like customer lifetime value, for example, that's an important thing to add. But if the schema changes, then the shape of the data is going to change. And it's very difficult to understand yesterday's world through today's schema unless you have that all tracked.

Unless you can go back in time and see when those changes took place. But Joe, why don't you explain why that matters, and why the metadata matters, and why it's important to be able to go back in time to see when things changed?

- Great. People know this pain; they've dealt with it before, and they know how complex it can be. As your business evolves over time, so do the business needs and how people use the systems. Now, in today's SaaS world, it's so easy to customize and modify, which can actually have detrimental effects on your business. To get into some of the very techie things: you can change a field type, you can add a field, you can delete it.

Once you delete a field, you can recreate another field that might have the exact same name you used a year ago, but with a different definition than the one you have now. So all of those schema changes over time are very critical to understanding your data relative to what you want to do with it. And there are different pieces to it if you want to recreate the data. We have a thing called Time Machine inside Salesforce, or the SaaS tool, where you can see every version of a record. It doesn't matter if it happened across 500 different backups, or whether you ran it twice a day; if you look at a record, you can see every version, whatever it contains. And you can go back and restore a previous version, or view it, or access that data itself. Knowing the schema definition is one of the biggest pieces.
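To make the idea concrete, here is a minimal sketch of viewing a record as of a point in time, alongside the schema version that was in effect when it was captured. The in-memory sample data, field names, schema version numbers, and the as_of helper are hypothetical illustrations, not the actual Time Machine API.

```python
# Illustrative in-memory history for one record; in practice these versions
# would be read back from the customer-owned history store.
versions = [
    {"captured_at": "2021-03-01T09:00:00Z", "schema_version": 4,
     "fields": {"Name": "Acme", "Stage": "Prospecting"}},
    {"captured_at": "2021-05-12T14:30:00Z", "schema_version": 5,
     "fields": {"Name": "Acme", "Stage": "Negotiation", "LifetimeValue": 50000}},
    {"captured_at": "2021-08-20T11:15:00Z", "schema_version": 5,
     "fields": {"Name": "Acme Corp", "Stage": "Closed Won", "LifetimeValue": 50000}},
]

def as_of(history: list, when: str) -> dict:
    """Return the latest captured version at or before the given ISO timestamp,
    i.e. 'what did this record look like back then, and under which schema?'"""
    eligible = [v for v in history if v["captured_at"] <= when]
    if not eligible:
        raise ValueError("no version captured on or before that date")
    return max(eligible, key=lambda v: v["captured_at"])

snapshot = as_of(versions, "2021-06-30T00:00:00Z")
print(snapshot["schema_version"], snapshot["fields"])  # schema 5, the May version
```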

ERIC KAVANAGH: I could tell you saw that text that just came in: thanks, from a former victim of software as a service. I mean, I'm sure you talked about some of this stuff earlier, and we only have a couple of minutes left, Joe, but all it takes is getting burned once, seriously burned. Then you learn the lessons, right? And then you're like, OK, I need to get my house in order here and make sure I'm on the ball with this kind of stuff. Because one unpleasant experience will teach you that lesson forever and ever.

And what you want to do is learn the smart way, right? Don't learn the hard way; learn the easy way, by listening to shows like Inside Analysis and listening to guys like Joe. Because, as he described, with a lot of these cloud-based solutions, number one, it's your responsibility to back it up. That's number one. Number two, it's your responsibility to make sure you can get access to all that kind of stuff, and it's also your responsibility to keep it secure, right?

In the cloud, they have this concept of "shared responsibility." I'm doing the air quotes there. And shared responsibility means it's your responsibility; that's basically what it boils down to. And it ties in some of these other themes, like metadata and governance, all these other topics we've been discussing. The key, I think, is for a business analyst to be able to get up to speed and understand what's going on. That's what we're talking about: enabling the business analyst to get enough perspective on the corporate data to make intelligent decisions.

And the key to doing that is to be able to see the trajectory, to see how things are moving. We get the easy stuff, like year over year it's up 20%, it's down 20%. That's good high-level stuff. But you have to get into the weeds to really understand the specifics on things, on specific accounts, like account-based monitoring, or account-based marketing as they call it. Just as a last comment, maybe that's the last thread to dive into.

That's a really powerful use case for what you folks have created. Account-based marketing: allowing the salespeople and the marketers to look at the history of a particular prospect. When did they come in? What did they do? Who else was related to them?

All those different questions can be answered if you have a view of your historical data. And then your salesperson makes a whole lot more sense, and makes much more salient commentary, when they're on the phone with the actual client. Because that's the thing: you have to do as much homework as possible before they get on the phone. And a good salesperson with the right data is going to make that sale, or at least get closer to it, right? But if you want to blow a sale, just ignore your history, ignore your historical data, and pick up the phone and call that prospect.

That's when things go south pretty quickly. I guess just wax philosophical on that for our last couple of minutes, Joe. It's so important to know who your customers are and to understand what's important to them. And if you do that research, you're going to have a really good chance of reselling, upselling, and keeping that customer around, right Joe?

- Absolutely, and to dive into that a little bit further: you're 100% right on every use case you mentioned, and everyone can theorize about the value of historical data and why their business needs to use it to answer very pertinent questions like those. The one biggest thing I've really been preaching to people about is why we wanted to build bring-your-own-cloud. I really believe that's the way the world is going: owning your own data.

Don't move it from your SaaS applications into another piece of backup software.

ERIC KAVANAGH: Right.

- You're just moving it: hey, I want to take it out of Salesforce and pay another subscription to another cloud vendor, where I still don't own it. There are lots of players in the market; you can look them up. There are ones like Veeam or Druva, the big established ones, and they have these large clouds. What they do is take all the backup data, lock it in their cloud, and then rent access back to it. You've just moved it from one cloud to another, and you still don't own it, right?

So, to answer the historical question: when I say bring your own cloud, I mean storing it in the bare primitives of your cloud, whether it's Amazon, Azure, GCP, or IBM, if you use those clouds, and retaining that data forever so you have the option to answer these business questions. Right? The biggest thing I would say is that a lot of people don't think about historical data today because it's simply not a possibility.

ERIC KAVANAGH: Right.

- It doesn't exist in these clouds, so to speak. They want to ask questions. They want to ask business velocity questions, but a lot of people think it's such a large chasm to get over to answer these questions that they just say, I've been burned by this before, I'm not going to build a data warehouse and spend millions of dollars. I'm talking clicks. It takes more effort to think of the business question than it does to actually build the answer for it with a few clicks. Not code, not ETL; the business question is the toughest piece. OK, what question?

And it can be about pipeline. I can tell you we could probably talk about the 15 business questions that every customer wants to ask, and they all revolve around revenue acquisition or customer retention. Those two camps are all about protecting revenue, protecting your business, tracking business velocity, and customer acquisition; there's a whole other piece around business operations. But the majority of people right now care about customer acquisition and customer retention, historically, because you want to look at risk based on your history of customers leaving.

What is that? Is it the number of cases created per month? Is it opportunities trending down while cases increase? Those are good indicators of attrition.

ERIC KAVANAGH: That's right.

- All of those are historical pieces. That's one of the biggest things, and I can't focus on it enough.

ERIC KAVANAGH: Yeah, that's such a great way to end the show, and you've reminded me of a couple of different things here. But folks, look these guys up. It's grax.com, and that point you made there at the end is really critical; in fact, I'm kicking myself for not finding a way to articulate it earlier. These are the key metrics to watch. Where is the money coming from? Where is the money going? How well is customer service working?

These are really basic things, but you'd be surprised, especially if you have a lot of turnover and you've got new people coming in. You want to educate them as quickly as possible, folks, and there's no better tome for them to research than your own historical data. That's what you want them looking at: understanding where you've come from, where you are right now, and then ascertaining where you're going. That's what you can do with historical data, folks.

Don't be a stranger. Send me an email: info@insideanalysis.com. We'll talk to you next time. You've been listening to Inside Analysis.
