Description
Join Laurence Boyce, Sales Engineer at Gearset, as he breaks down the importance of a lifecycle approach to Salesforce data management. This session is the first of a three-part series and focuses on the key phases of data management, from data collection to archiving and, eventually, permanent deletion.
Next in the series:
- Episode 2: Strategies for safeguarding your Salesforce Org
- Episode 3: Optimizing your Salesforce data management
Learn more:
- Gearset’s Salesforce backup solution
- Gearset’s Salesforce archiving solution
- Salesforce backup: The complete strategy for data protection and recovery
- Why backup & archiving are key to confidence in the operate stage of Salesforce DevOps
- Who’s responsible for your Salesforce backups?
- Salesforce backup and recovery best practices for a reliable backup process
- How to create a Salesforce data archiving strategy
- Success story: Granicus
Transcript
So thank you very, very much everyone for joining.
So to give you a quick overview, we're gonna have a three-part webinar series: today, in two weeks' time, and then two weeks after that. And we're gonna be talking through all aspects of the data lifecycle process.
In terms of what we're gonna run through today, we've got episode one, of course: end to end Salesforce data control. You'll also notice, for anyone joining, there is a Q&A function.
So any questions, thoughts, concerns as we go through, please drop them in the chat. You'll see we've also got Serena and Aga, who work in our event marketing and product teams respectively. They'll be there to answer any questions, and we'll also answer any other questions live at the end.
Unlike some webinars you might see around, this isn't prerecorded. It's real. It's 6:03pm on Tuesday, the fifth of November. So, yeah, it's real life, so your questions will be answered.
So let's get cracking then. So today, we're gonna have that part one. So around end to end Salesforce data control here and a life cycle approach to Salesforce data management.
So after its employees, data might just be an organization's most valuable asset, with some saying that data is like the oil of the twenty first century.
And for any team that creates, stores, and uses data in their Salesforce org, which I'm sure, given the nature of this series, is everyone here, ensuring your data is stored and used effectively is critical to your ability to make informed decisions on how to innovate and ultimately grow.
So in the next sort of twenty five minutes or so, the goal here is that we'll understand the phases of data life cycle, specifically in the Salesforce context.
We'll look at some of the considerations and the management strategies to handle this also.
And I'm not here to tell you to go and do any of these or specifically what's a right or wrong approach, but I aim to inform you of a few strategies and commonly overlooked aspects within this. We'll also be showing a brand new solution that Gearset are bringing to market that not many people have ever seen.
So before we get started, a quick intro to myself. I'm Laurence Boyce, a Sales Engineer in the technical team here at Gearset, supporting customers with their evaluations right through to trials and proof of concepts, then into delivery. I know a few of the attendees here are ones I'm working with at the minute, so good to see you. I'm looking forward to working with you going forwards.
So firstly, let's establish together what is a data life cycle management process.
We've all heard the stats about data volumes increasing at an exponential rate, especially in the cloud, where it's predicted that by the end of this year, there'll be over a hundred trillion gigabytes of data stored. And we see this in the Salesforce context with reports and cases and integrations as well. Having more and more data is great, but maximizing the effectiveness of it is by no means a simple task.
So organizations need a strategy to define how to manage data throughout its life cycle from creation to deletion and all those steps in between. And that process is commonly known as a data life cycle management or DLM process.
So based on this, it's totally fair to ask who should implement a DLM process. And while this should be considered by all organizations, it's especially important for those with a lot of data flowing into Salesforce causing large data volumes, perhaps due to external integrations, and those subject to regulatory or compliance requirements in sectors ranging from healthcare and central government to financial services and pharmaceuticals.
And these regulations such as GDPR and HIPAA are expansive, but failure to comply can lead to severe legal, financial, and reputational damage.
So let's specifically talk then around the Salesforce context.
And just a few benefits for implementing a Salesforce specific data life cycle management process.
I'm sure many of our orgs contain sensitive data, whether about our contacts, business partners, or internal employees.
So a thorough DLM process will build security into your processes, safeguarding information and providing access only to those who require it.
Gartner have predicted that over thirty percent of Salesforce data is out of date within twelve months from its inception.
An example could be a contact leaves the company, changes role, or has their details updated.
So over time, as data in Salesforce becomes outdated, you want the data that your users interact with to stay relevant.
So DLM processes define the review, cleanup, and quality maintenance of data, providing confidence to your users when driving that decision making.
And it's commonly known that Salesforce data and file storage costs can be enormous, and often prohibitively so. So having a strategy to offload data from Salesforce, freeing up storage space while keeping this data available, would also fit within a data lifecycle management process.
And large data volumes within Salesforce can also result in severe performance degradation in your org, with impacts on page load times, slower reporting, or even record locking.
So as with the previous point here, data life cycle management processes will support the periodic removal of data from Salesforce to avoid unwanted impacts on your org. And although this isn't a comprehensive list, these are just a few examples of why a plan is valuable for Salesforce while balancing security, system performance, and cost optimization.
So you might be thinking, okay, Laurence.
Fine. I'm bought into the why. But what actually are the steps to consider in a DLM plan?
And if that's what you're thinking, I'm delighted that's what you're thinking because that's what we'll look at now.
And there's a few commonly used processes varying in their number of stages, but all follow a very similar concept.
And that is managing the data from creation to destruction.
And specifically for Salesforce, I believe the five-stage approach leveraged by companies such as IBM and Shopify is likely a good balance: it doesn't overcomplicate the process, but it maintains a suitable thoroughness.
And this five-stage process boils down to creation; processing and storage; then actually using that data and driving analysis; archiving and offloading that data; and finally the permanent destruction of it for regulation and compliance requirements.
So let's focus on each one of these in turn.
Data creation is where it all begins, and this can occur in a number of ways. Whether it's a support team member assisting a customer with emails and cases, marketing uploading contacts from an event, or your dev team building new CPQ configurations, for example.
The possibilities for data creation are endless. And this is the fundamental source that drives all downstream activities and business decisions.
So while it's often appealing, and possibly simpler, to gather as much data as possible, it's also important to consider the input quality and importance of the data, to ensure it aligns with what you're trying to achieve by bringing it into Salesforce in the first place.
So although there's lots of methods for creation, let's look at the next stage, processing and storage.
And although on the surface, collecting, processing, and storing data sounds pretty straightforward, this step can involve a number of aspects from wrangling and cleaning to encryption and transformation.
When data originates from a wide variety of sources, many of these transformation techniques are required, like what we see in Data Cloud, for example.
Additionally, compliance and regulatory requirements are becoming more and more prevalent, and the impacts of breaking these are becoming increasingly severe.
So with this in mind, I'm gonna focus on a concept called the CIA triad, which serves as the foundational principle for IT security in data carrying systems such as Salesforce.
So CIA stands for confidentiality, integrity, and availability.
Confidentiality, then: not all users require access to all data within your orgs. So this involves implementing access controls such as SSO or MFA, and the classification of data to enforce the principle of least privilege.
Of course, in the Salesforce context, this translates in many ways, such as implementing role hierarchies, sharing rules, permission sets, field-level security, etcetera.
And these all provide fine-grained control over access to data.
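To make that concrete, here's a minimal sketch of how you might audit field-level access outside the Setup UI, using the open-source simple-salesforce Python library. The credentials and the 'Support_Agent' permission set name are illustrative placeholders, and a real least-privilege review would cover profiles and sharing too:

```python
# Minimal sketch: auditing field-level security for one permission set.
# Credentials and the permission set name are illustrative placeholders.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...",
                security_token="...")

# Look up the permission set, then read its FieldPermissions records,
# which capture which fields it can read or edit.
ps = sf.query("SELECT Id FROM PermissionSet WHERE Name = 'Support_Agent'")
ps_id = ps["records"][0]["Id"]

soql = (
    "SELECT SobjectType, Field, PermissionsRead, PermissionsEdit "
    f"FROM FieldPermissions WHERE ParentId = '{ps_id}'"
)
for rec in sf.query_all(soql)["records"]:
    access = ("read/edit" if rec["PermissionsEdit"]
              else "read-only" if rec["PermissionsRead"] else "none")
    print(f"{rec['Field']}: {access}")
```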
And moving on to integrity then, of course, providing all users with the confidence that the data they access can be trusted is critical when making strategic decisions. So integrity involves proactively managing the accuracy and completeness of this data throughout its life cycle.
And the data can only be leveraged to drive decisions if it's actually visible to the required users. So availability encompasses maintaining the system's reliability, through to disaster recovery and incident response planning, to avoid any business disruption.
So this brings us nicely onto our next topic: just storing data within Salesforce is not enough.
A critical aspect of any data life cycle management plan is a comprehensive data backup and restore solution.
So while we're here, I'd like to highlight a few commonly misunderstood and overlooked aspects of data storage within Salesforce.
From sales targets to marketing campaigns, Salesforce underpins core business functions and revenue streams. In fact, ninety eight percent of teams report Salesforce is critical to their business objectives.
So with Salesforce at the heart of your organization, you don't wanna risk any disruption to your orgs.
And although some believe that Salesforce is immune to losses because it's cloud based, this is unfortunately just not the case.
And one thing that we hear really commonly is "there's a backup in Salesforce, right?"
And this just isn't the case because Salesforce subscribes to something that's called the shared responsibility model.
And that means that while Salesforce are responsible for the security and availability of the platform, you are responsible for ensuring that your data stored within Salesforce is protected.
And this means that you and you alone are responsible for backing up your Salesforce data and metadata, and for ensuring that you comply with the corporate and industry regulations you're subject to, which, depending on the industry you're in, could be extensive.
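As a rough illustration of what owning the backup means at the simplest level, here's a hedged sketch that exports one object's records to CSV with the simple-salesforce Python library. The credentials and field list are assumptions, and it's nowhere near a full backup: a real solution also covers metadata, relationships, attachments, and scheduled runs.

```python
# Minimal sketch: exporting one object's records to CSV as a crude backup.
# Credentials and the field list are illustrative; a purpose-built backup
# tool also handles metadata, files, relationships, and scheduling.
import csv
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...",
                security_token="...")

fields = ["Id", "Name", "Email"]
records = sf.query_all(f"SELECT {', '.join(fields)} FROM Contact")["records"]

with open("contact_backup.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fields)
    writer.writeheader()
    for rec in records:
        writer.writerow({k: rec.get(k) for k in fields})
```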
So despite the real risk of Salesforce data and metadata loss, the majority of Salesforce teams still don't use backup to protect their business critical data and functions.
And there are a number of backup tools available, purpose-built specifically for Salesforce. Some of you may know Gearset provides just one of those. So if any of these resonate with you, I thoroughly recommend checking it out.
We'll also be covering backup in more detail in episode two of our webinar series, Strategies for safeguarding your Salesforce org.
But at this point, let's move on to stage three, usage and analytics, where data is used to build up the customer 360 view, with reports and dashboards providing meaningful insights to propel your organization forward.
And although this is clearly a huge area of the DLM process, given how much it varies with organization size, business model, processes, and industry, I'm gonna focus on a more commonly overlooked aspect of it: ensuring your development team members have access to production-like data when they're building their customizations.
And at Gearset, we often hear that the first place metadata changes get tested alongside data is in UAT, maybe just before release to production.
And, of course, this means that if any errors are found, the features have to be totally refactored back in the development environment, which at best is time consuming and frustrating, but could cause delayed projects, missed deadlines, and lost revenue.
So to address this, teams use sandbox seeding tools to test development work on data early in the process, giving significant benefits.
But it's not that straightforward, with a constantly evolving data schema, to have that data in lower environments. So what are some commonly used solutions to address this?
First one that we see fairly often is sandbox refreshes.
And this sounds great, but with the varying sandbox types, their associated refresh cycles, and the challenges to day-to-day activities that refreshes can cause, this is often not feasible.
We also see custom scripts being used, but these are often time consuming to build, run, and maintain with your constantly evolving org. And maybe not all team members are equipped to run these scripts.
And this is where, really, as mentioned just a second ago, a dedicated sandbox seeding type of tool comes in, providing a streamlined method for moving production like data with your latest schema into your sandboxes without the limitations of refresh cycles or the burden of a script based approach.
But commonly, the most beneficial aspect of a seeding tool is masking or scrambling: applying an anonymization layer to the PII data within your orgs. Not only does this ensure that your developers have safe, production-like data to test on, without the risk of accidentally triggering automation such as sending emails, but it also ensures you stay compliant with regulatory measures if you have external parties such as contractors or SIs working in those lower environments.
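As a rough sketch of what that anonymization layer does, here's some illustrative Python. Deterministic hashing keeps masked values consistent across records, so anything that matches on values still lines up; the field names are assumptions, and real seeding tools handle this across whole object graphs.

```python
# Minimal sketch: deterministic masking of PII fields before seeding a
# sandbox. Field names are illustrative placeholders.
import hashlib

PII_FIELDS = ("Name", "Email", "Phone")

def mask(value: str) -> str:
    # The same input always yields the same token, so value-based
    # relationships between records survive the masking.
    return "masked-" + hashlib.sha256(value.encode()).hexdigest()[:10]

def mask_record(record: dict) -> dict:
    masked = dict(record)
    for field in PII_FIELDS:
        if masked.get(field):
            if field == "Email":
                # Keep a valid email shape so validation rules still pass,
                # and use a reserved domain so no real mail can be sent.
                masked[field] = mask(masked[field]) + "@example.invalid"
            else:
                masked[field] = mask(masked[field])
    return masked

print(mask_record({"Name": "Ada Lovelace", "Email": "ada@example.com"}))
```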
And as with backup, there are a number of sandbox seeding tools available, specifically built to handle the complexities of Salesforce.
As with backup, one of those options is, of course, Gearset too, but we won't look at that in too much detail here.
But while we're on the subject of usage and analytics, I'd like to take this opportunity to show you folks who have hopped on this evening, this lunchtime, or this morning, depending on where you are in the world, a brand new solution that Gearset are bringing to market that not many people have seen before, and that is the data dashboard.
So just think for a second.
Have you ever looked at your Salesforce storage utilization?
I bet some of you have. And now have you ever tried to figure out how that number has changed over time?
Or have you ever tried to predict how long you have left until Salesforce comes knocking at your door with a potential bill because you've reached a hundred percent?
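If you've never checked, the raw numbers are available over the REST API. Here's a hedged sketch using simple-salesforce's limits endpoint plus a naive linear projection; the credentials are placeholders and the growth rate is a made-up figure you'd replace with your own logged history, which is exactly the gap a dashboard fills:

```python
# Minimal sketch: reading data storage limits and projecting days until
# full. Credentials are placeholders, and the growth-per-day figure is
# illustrative; in practice it would come from your own usage history.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...",
                security_token="...")

# The standard REST limits endpoint reports storage in megabytes.
limits = sf.restful("limits")
data = limits["DataStorageMB"]          # {"Max": ..., "Remaining": ...}
used_mb = data["Max"] - data["Remaining"]

growth_mb_per_day = 12.0                # assumption: from your own logs
days_left = data["Remaining"] / growth_mb_per_day
print(f"Using {used_mb} of {data['Max']} MB; "
      f"roughly {days_left:.0f} days until the limit at current growth.")
```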
Earlier this year, Gearset launched archiving, and we'll touch a bit more on that in a moment.
And while chatting with so many of you, we learned that storage is not something that's easily understood or easy to plan for. But it can cost a lot of money, and you might not see it coming.
And this is why we built the data dashboard, and we're launching it in pilot to our backup users this week so it is real.
So let's move into Gearset. I'll give you a quick live demo.
So you should be able to see Gearset now.
Please message in the chat if you can't.
But our data dashboard, as we can see here: I've moved from the data backup job screen that you may be familiar with into the dashboard. This uses your historical backup data to show how your storage usage has increased over time against the overall limit that you have with Salesforce.
So this can help you analyze your data and file utilization and plan your budget accordingly.
The data dashboard also shows the fastest growing objects, and we can choose the period over which we want to view this. This also combines with the object analysis in Gearset backup, so you'd be able to dig into what's happening in your org, resolve issues, or use the data dashboard to have those important conversations with your peers on whether you need to move the data somewhere else, plan a cleanup, buy additional storage, or do something else.
And we're excited to bring the data dashboard to our backup customers and grow this feature further in the coming months, and we'd also love to hear your feedback on this. So if you've got any questions, please add these into the chat.
On the right-hand side as well, what I failed to mention is that you'll see how the org is growing over time. And shortly, that will become a forward-looking view as well, to help you in that understanding.
So at this point, please, as I said, any questions, thoughts, concerns on that, please drop those in the chat. But let's pivot back into where we were in the slides here, and then let's look at the next stage and perhaps the most commonly overlooked part of the DLM process, archiving.
So an archiving process removes data that is no longer required in the org but can't be permanently deleted, perhaps due to compliance requirements or a desire for it to ultimately be accessible if required.
And specifically in the Salesforce context, there's a number of reasons why defining your data archival strategy is beneficial.
All customers are limited in the volume of both data and files they're allowed to store within their Salesforce instances.
And this roughly correlates to how much they spend with Salesforce.
And as such, when customers are coming towards their storage limits, they're presented with a few options:
Delete the data, purchase additional storage, or do something else.
And for many customers, regulation and compliance, along with unknown future costs and performance degradation, mean options one and two are just not viable.
So this leaves option three. If Zoom background doesn't chop my fingers off, there we go. And option three then is doing something else with that data, and that's where archiving comes into play.
But it's not just the storage limits that Salesforce imposes that mean an archiving strategy should be adopted.
Stale or out-of-date data can influence reporting and analytics, leading to ill-informed decision making.
Proactive archiving also reduces the risk of inactive data being exploited or leveraged by attackers to gain access to your Salesforce systems.
And finally, there's the performance degradation that we mentioned earlier: automating the process of removing data from Salesforce will avoid unwanted impacts such as slow page load times, slow reporting, or record locking.
And an archival process can vary in its complexity, but it commonly boils down to: what data is critical to remain in Salesforce, and what is stale or should be removed but can't be permanently deleted? What data is consuming the majority of storage space? And when does that data need to be end of life, so you can apply the appropriate retention policies?
And, of course, evolving the archiving process as your organization evolves over time.
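To give a flavor of the first of those questions, here's a hedged sketch that flags stale records as archive candidates using a SOQL date literal with the simple-salesforce Python library. The Case object and the two-year cutoff are illustrative assumptions; a real archiving tool would also preserve related records and apply the retention policies themselves.

```python
# Minimal sketch: flagging closed cases older than two years as archive
# candidates. Credentials, object, and cutoff are illustrative.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...",
                security_token="...")

# SOQL date literals make age-based selection straightforward.
stale = sf.query_all(
    "SELECT Id, CaseNumber, ClosedDate FROM Case "
    "WHERE IsClosed = true AND ClosedDate < LAST_N_YEARS:2"
)["records"]

print(f"{len(stale)} closed cases are candidates for archiving.")
```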
And there are also a number of specific Salesforce archiving tools, and, of course, Gearset provides one of those as well. So that will also be covered in episode three of our webinar series, Optimizing your Salesforce data management.
So let's move on then to that fifth part of the DLM process.
As we've discussed a couple of times, regulation and compliance requirements such as GDPR, HIPAA, or PCI DSS impose strict guidelines on the minimum time that data should be stored and ultimately when this should be permanently purged.
However, in a highly relational database like Salesforce, removing all existence of data not only from Salesforce itself, but also any other backup or archiving storage locations can be incredibly challenging.
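For the Salesforce side of that, here's a hedged sketch of a hard delete via the Bulk API through simple-salesforce, which bypasses the recycle bin. It assumes the "Bulk API Hard Delete" permission is enabled and uses a made-up record Id; note it does nothing about copies held in backups or archives, which is where the real challenge lies.

```python
# Minimal sketch: permanently purging records with a Bulk API hard delete,
# which skips the recycle bin. Requires the "Bulk API Hard Delete"
# permission; the Id below is a made-up placeholder. Copies in backup and
# archive storage must be purged separately.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...",
                security_token="...")

to_purge = [{"Id": "500000000000000AAA"}]  # records past their retention date
results = sf.bulk.Case.hard_delete(to_purge)
print(results)
```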
But there are also some purpose-built solutions to assist you with this.
So Gearset backup and archiving will enable you to set up easy retention policies that will help you stay on top of these requirements, adhering to the destruction requirements that you're subject to.
So in summary, a DLM process is the concept of managing data from creation right through to destruction, with one option being a five-stage process that boils down to creation, processing and storage, usage and analytics, archiving, and destruction. And as we discussed earlier, Gearset supports customers with stages two and three of their DLM plan, with data backup and recovery along with sandbox seeding;
stage four, with archival; and stage five, the ultimate purging of data, with retention policies within these.
There's also the data dashboard that we briefly saw earlier, which enables you to understand and predict how your storage is being used and how it's growing over time.
So that brings us to the end of what we were aiming to cover today.
However, I know there might be a couple of questions, thoughts, or concerns in the chat. But, also, I should say, we've mentioned a few times this is the first part of a three-part series, and there's gonna be lots of additional information and insights coming over the next two. So the next one is around strategies for safeguarding your Salesforce org, on November the nineteenth, same time.
So we'd love to see you there. But while we wait, before we drop, I'm just gonna open the floor for any questions.
Hopefully, there's a q and a function that folks can see.
Hey, Laurence. Yes, there is one question in the Q&A. It is: how can Gearset work with us further to ensure our data lifecycle strategy is robust, outside of the webinars?
Got it. Okay. Well, thanks a lot. So how can we support you in ensuring your sort of holistic data approach is robust?
Yeah. Of course. So for any Gearset customer, you will have what we call a customer success manager.
They are there to support you and ensure that we help you meet your goals. So they're there as one resource for you.
Additionally, for any specific questions regarding data products that you may have with Gearset, there's not only the support team, which existing customers and trial customers have access to, but also the sales and technical team here in the presales phase. We're always happy to support you, not only trying to understand and guide you on that journey, but also with any Gearset-specific questions too.
Additionally, for those of our larger teams, you have access to our DevOps architect teams as well, and they are dedicated industry veterans who will help establish these processes for you. So there's a whole suite of teams that support all of our customers. And, additionally, I forgot to mention that for many of our teams, you will have a dedicated onboarding manager as well as part of your Gearset journey. So that will help you get up to speed with your Gearset subscription after the purchasing stage and through implementation.
So they'll definitely be able to support you with establishing those processes as well.
So I appreciate there's quite a few teams there to assist, but I think, long story short, there's lots of folks here to help.
Any more, Serena?
That's it for now. Yep.
So for now, I've seen there's some comments rather than questions.
Good to see some people excited about the data dashboard.
Yeah, it's included for backup customers. That's all good. Okay. Fantastic. So we're on half past six here, or some other time entirely, depending on where you're based and your time zone.
Thanks everyone for joining on this Tuesday. It will be Tuesday for everyone on the call. So thank you very much indeed, everyone, for joining in. It's been great to host you this afternoon or this morning or this evening.
Thank you so much. I'm looking forward to seeing you in two weeks' time.