
How to Get Started with Salesforce’s Free Data Cloud Credits

Reading time: 15 min  |  By Mehmet Orun  |  Published in Articles

And How to Avoid Wasting Them, Too!

Salesforce’s decision to offer free credits for Data Cloud is a brilliant move to boost sales and adoption. The challenge so far has been a lack of implementation guidance: customers are uncertain how to start using their free Data Service Credits.

What would be an impactful use case for the free credits? How do you evaluate the technical viability of Data Cloud? In particular, customers are justifiably concerned about wasting these credits or going over the free allotment before seeing results.

In this blog, I’ll explain how to take advantage of Data Cloud’s free credits to build a business case for Data Cloud that addresses the key concerns of Business, IT, and Finance stakeholders. We’ll also dive into how Data Cloud’s credit pricing works and share strategies to minimize credit waste.

I believe these guidelines offer a safe, effective way to determine the value and technical viability of Data Cloud. I look forward to your questions, feedback, and success stories on how this works for you.

What Does Free Data Cloud Really Mean?

At Dreamforce ‘23, Salesforce announced Free Data Cloud for Enterprise Edition and above customers, enabling them to “unify 10,000 profiles at no cost”. The fine print here is important: the ability to unify 10,000 profiles is an approximation. Data Service Credits are the primary entitlement and pricing mechanism for Data Cloud. What Salesforce is really giving customers is an annual allotment of 250,000 Data Service Credits (2,500,000 for Unlimited Plus edition), prorated against a Core contract.

Think of Data Service Credits as battery power for Data Cloud usage. We may have the same model of phone, but if you’ve installed a bunch of power-hungry applications, your battery will die faster than mine. In the same way, more demanding orgs will consume more battery (credits) than others. It’s up to each customer to estimate how much battery power they’ll need.

In the same way you can control push notifications to minimize battery usage, there are strategies for reducing Data Cloud credit consumption. Successfully connecting the right data and implementing the appropriate design can ensure you use the 250K credits effectively. But not having a plan may result in squandering the credits with little to show for your efforts.

How Are Salesforce Data Service Credits Calculated and Consumed?

As a usage-based product, Salesforce Data Cloud consumes Data Service Credits. The more complicated the task, the more credits you will need. The amount of data processed also increases the credit cost.

For example, querying data costs 2 credits per million records. However, batch-transforming a million rows to remove unwanted content requires 2,000 credits. Streaming the same transformation costs 5,000 credits. Applying transformation rules to get a consistent view of two million rows across different sources will cost 4,000 credits for batch processing.
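
To make the arithmetic concrete, here is a minimal sketch of a credit estimator based on the rates quoted above. The operation names and helper function are illustrative only, not a Salesforce API, and the rates should always be checked against the current rate card before you rely on them:

```python
# Hypothetical credit estimator using the rate-card figures quoted above.
# Rates are credits per million rows processed; verify against the current
# Salesforce Data Cloud rate card before planning around these numbers.
RATES_PER_MILLION = {
    "query": 2,
    "batch_transform": 2_000,
    "streaming_transform": 5_000,
}

def estimate_credits(rows: int, operation: str) -> float:
    """Return the estimated Data Service Credits to process `rows` records."""
    return rows / 1_000_000 * RATES_PER_MILLION[operation]

# Batch-transforming two million rows across sources: 4,000 credits
print(estimate_credits(2_000_000, "batch_transform"))  # 4000.0
```

Running a few scenarios like this up front makes it much easier to judge whether a planned PoC fits inside the 250K free allotment.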

Such a model may raise concerns, but it should not be scary. Your organization probably processes data in public clouds (AWS, GCP, Azure, etc.), which also utilize a credit-based model. The difference? Salesforce is transparent about the costs and complexity of integration and processing, a shift from the past where these expenses were concealed.

The Data Cloud rate card (current as of publishing) details how Data Service Credits are consumed.

Data Services Credits rate card for Salesforce Data Cloud

Use Free Credits to Make a Business Case for Data Cloud

To drive support for Data Cloud, you need to persuade three key stakeholders—business, IT, and finance. Each of these stakeholders has different concerns and understandings of Customer 360 solutions.

  • When talking with a business leader, I focus on the lost business opportunity of disconnected data.
  • For CFOs, I highlight the cost benefit of consolidating an end-to-end Customer 360 solution on a single platform. Hidden costs often span various license, implementation, and headcount line items.
  • With IT Leaders, I underscore technical viability and security. I call attention to how Data Cloud eliminates the technical complexity of evaluating, implementing, and integrating various other technologies. This takes time and effort and there is no guarantee the technologies will work together seamlessly.

In my experience, leveraging a proof of concept (PoC) is the ideal approach to facilitating these conversations. Thankfully, the free Data Cloud credits make a PoC a low-risk proposition.

Start with a Data Cloud Proof of Concept

There are three distinct steps to planning your Data Cloud PoC. If done correctly, these steps will address the needs of your stakeholders in business, IT, and finance:

  1. Select an impactful, yet representative use case
  2. Identify the right data (sources, records, fields) to demonstrate the value of unifying data
  3. Deliver a functional demo with quantified benefits and implementation costs

Select an Impactful Use Case

Data Cloud can solve many interesting challenges, but there are only three drivers for business investments:

  • Increase revenue or time to revenue
  • Decrease costs
  • Ensure compliance (which can work against the two drivers above)

Heuristically, I have found the most success by pursuing lost sales use cases. For B2B customers this means showing where an incomplete picture of the customer translates to lost sales opportunities. In B2C contexts this translates to highlighting a loss of customers or a higher cost of sale due to disparate data.

Your use case description should be simple and relatable. For example:

  • B2B: We will assess our global customer base across 3 lines of business to discover potential cross-sell or up-sell opportunities.
  • B2C: We will determine our true unique customer count and segment customers by revenue so we can offer white-glove service to our best customers inclusive of guest or gift orders.

Want a deep dive on crafting a Data Cloud business case with free Data Services Credits?

Watch our on-demand webinar on identifying Data Cloud use cases and maximizing your free Data Cloud credits.

Identify the Right Data to Support a Data Cloud PoC

I believe an effective Data Cloud PoC has 3 data sources and 3-5 source objects per data source. This provides sufficient data to demonstrate insights while keeping the work required for the PoC reasonable. Remember, the goal is to help your stakeholders assess the value, viability, and potential cost of Data Cloud. A targeted approach with adequate, yet constrained data sources will allow you to stay focused and go fast.

The best data sources for your PoC will demonstrate the business benefit and technical viability of Data Cloud. The table below illustrates common starting points.

Company Type | Data Sources (target 3) | Data Source Objects (target 3-5 objects)
B2B | Multiple Salesforce orgs (primary org + two others) | Leads, Accounts, Contacts, and Opportunities
B2B | Multiple CRMs (HubSpot, Salesforce) from a recent acquisition | Company, Contact, and Opportunity (HubSpot); Account, Contact, Lead, and Opportunity (Salesforce)
B2C | Multiple Salesforce orgs for different lines of business | Contacts or Person Accounts, Cases, Orders (e.g. if using Order Management)
B2C | Multiple Salesforce clouds or alternate technologies (e.g. Service, Commerce, Marketing) | Contacts, Cases, Customers, and Orders

Consider Data Volume to Minimize Credit Consumption When Connecting Data Sources

Now that you understand what data you’ll be bringing into Data Cloud, it’s time to think about the size of that data. Customers with lots of data will need to be particularly mindful of the free Data Service Credits. It is easy to use up or go over your free credits if you’re not careful.

First, consider how to ingest data into Data Cloud. Don’t even consider streaming data until you’ve proven the business value and technical viability of Data Cloud. A fixed data set allows you to have a steady snapshot of your data while proving value and viability. Fixed data sets are also easier to manage and will save on precious data credits. For comparison, it costs 3,000 more credits to stream a data pipeline than to batch it.

Next, look for ways to minimize the records you need to batch process. If your combined data source objects represent more than one million records, rerunning batch data processing will cost you at least 2,000 Data Service Credits a pop.

If you have more than one million records, I recommend working with a subset of your data. At this stage, the primary goal is to showcase the full potential and value of Data Cloud. Attempting to bring in all data will delay progress and could exhaust your free credits.
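
A quick back-of-the-envelope sketch shows why subsetting matters. Using the batch-transform rate quoted earlier (2,000 credits per million rows) and hypothetical record counts, repeated reruns over the full data set burn credits an order of magnitude faster than reruns over a 10% subset:

```python
# Sketch: credit burn for repeated batch runs, full volume vs. a subset.
# The rate is the batch-transform figure quoted earlier in this article;
# record counts are hypothetical.
BATCH_RATE_PER_MILLION = 2_000  # credits per million rows (see rate card)

def rerun_cost(rows: int, reruns: int) -> float:
    """Total credits consumed by running a batch transform `reruns` times."""
    return rows / 1_000_000 * BATCH_RATE_PER_MILLION * reruns

full_volume = rerun_cost(5_000_000, reruns=4)   # iterate on all 5M records
subset = rerun_cost(500_000, reruns=4)          # iterate on a 10% subset
print(full_volume, subset)  # 40000.0 4000.0
```

Four design iterations over the full 5M-record set would consume a large share of the free allotment; the same iterations over a subset leave plenty of headroom.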

The most impactful data to unify is that of customers with whom you transact the most today. Unifying high tier customers will uncover the most missed sales opportunities. It will also prove the solution’s effectiveness across all of your data.

The approach to identifying high-tier customers varies between B2B and B2C companies.

  • For B2C, focus on the most active Contacts (e.g. unique email addresses) or segment data by region.
  • For B2B, target companies with the most Accounts or Opportunities. Ensure a comprehensive view by using a third-party global identifier to find hidden account relationships in your data.
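
As a sketch of the B2C approach above, the snippet below ranks contacts by activity per unique email address to pick a high-tier subset. The record layout and field name are hypothetical stand-ins for a CRM export, not a Salesforce API:

```python
# Illustrative only: rank contacts by activity to choose a PoC subset.
# `contacts` stands in for records exported from your CRM; the "email"
# field name is a hypothetical example. The B2B variant would group by a
# company identifier and count Accounts or Opportunities instead.
from collections import Counter

contacts = [
    {"email": "Ada@example.com"},
    {"email": "ada@example.com"},
    {"email": "grace@example.com"},
]

# Count activity per unique (case-normalized) email address
activity = Counter(c["email"].lower() for c in contacts)

# Take the most active contacts as the high-tier subset
top_emails = [email for email, _ in activity.most_common(1)]
print(top_emails)  # ['ada@example.com']
```

In practice a data profiling tool does this ranking for you, but the principle is the same: unify the customers you transact with most first.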

Data profiling solutions, such as Cuneiform for Data Cloud, will accelerate your data assessment and minimize credit usage.

Cuneiform for Salesforce data profiling identifies high tier customers
Screenshot: Cuneiform for CRM Data Profiling solution

Identify a Subset of Fields to Accelerate Your Data Cloud PoC

The fewer fields you have to incorporate in your design, the faster you can deliver your PoC results. Unfortunately, it is not always clear which fields matter, and the credit cost of trial and error is high.

In Salesforce orgs older than five years, objects average 200-500 fields, and 15%-25% of custom fields may not be actively in use.

Salesforce data profiling field fill rate distribution

To determine which fields matter, I use net fill-rate: the percentage of records in which a field is populated, counting only fields that hold more than one distinct value (a field filled with the same default everywhere is effectively empty). My rule of thumb is that any field with a 50% or greater net fill-rate is likely meaningful to your stakeholders. Use data profiling to understand each field’s net fill-rate.
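
A minimal sketch of how net fill-rate could be computed under that definition (the records and field name are hypothetical):

```python
# Sketch: net fill-rate for one field. A field only "counts" if it is
# populated AND holds more than one distinct value across the data set.
def net_fill_rate(records: list[dict], field: str) -> float:
    values = [r.get(field) for r in records]
    populated = [v for v in values if v not in (None, "")]
    if len(set(populated)) <= 1:  # constant or default-only fields are noise
        return 0.0
    return len(populated) / len(values)

rows = [
    {"industry": "Biotech"},
    {"industry": "Pharma"},
    {"industry": None},
    {"industry": "Biotech"},
]
print(net_fill_rate(rows, "industry"))  # 0.75 -> above the 50% threshold
```

Fields scoring below the 50% threshold are strong candidates to leave out of the PoC scope.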

If you’ve ever tried profiling 3-5 objects with hundreds of fields each, you may be overwhelmed by the perceived effort and complexity of this data unification initiative. Yes, you can query the objects using DBeaver and capture the results in shared spreadsheets. This will likely take you a minimum of four weeks. Often, external query tools also consume credits inefficiently due to suboptimal queries and repeated processing.

Conversely, native Salesforce data profiling tools can achieve the same results in as little as one or two days. In addition to time savings, native data profiling solutions offer rich insights out of the box and save on credit usage too. The results are saved for use by authorized users within Data Cloud, meaning you can simply come back to access the insights and avoid credit waste.

Create reusable data profiles to avoid Data Cloud credit waste

Prevent Data Cloud Credit Waste with the Correct Design

Having selected representative data sources, records, and fields, it’s time to get down to business. Apply data profiling insights to prevent design errors that could compromise results and to showcase Data Cloud’s ability to overcome historical data quality challenges. This ensures a dual benefit: preventing credit waste from unnecessary data reprocessing and highlighting the solution’s capability to address data quality issues that have hindered legacy solution architectures.

It is time to:

  1. Design your Data Model Object (DMO) schemas
  2. Make Data Governance decisions on what data standards will ensure data consistency in your Data Lake Objects (DLOs)
  3. Map your source data in DLOs to the target DMO schema. Make sure to resolve data differences.
  4. Identify bad data that could distort results and cast doubt on your Data Cloud PoC. Common examples include likely-fake unique identifiers, such as dummy emails and phone numbers
  5. Execute and verify your outputs through Harmonization
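
For step 4, a simple pre-mapping filter can flag likely-fake identifiers before they distort match rates. The dummy patterns below are illustrative examples only, not an exhaustive or authoritative list:

```python
import re

# Hypothetical bad-data filter: flag records whose unique identifiers look
# fake (placeholder emails, repeated-digit phone numbers) so they don't
# inflate or pollute unification results. Patterns are examples only.
DUMMY_EMAILS = {"test@test.com", "noemail@none.com", "none@none.com"}
DUMMY_PHONE = re.compile(r"^(\d)\1{6,}$")  # e.g. 0000000, 1111111111

def is_suspect(record: dict) -> bool:
    """Return True if a record's email or phone looks like a dummy value."""
    email = (record.get("email") or "").lower()
    phone = re.sub(r"\D", "", record.get("phone") or "")
    return email in DUMMY_EMAILS or bool(DUMMY_PHONE.match(phone))

print(is_suspect({"email": "Test@Test.com", "phone": ""}))            # True
print(is_suspect({"email": "ada@example.com", "phone": "555-0100"}))  # False
```

Excluding or quarantining these records before Harmonization keeps match statistics honest and avoids reprocessing credits spent chasing false matches.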

Your final step is to deliver a functional demo people can relate to, complete with quantified benefits and implementation costs.

Demonstrate The Business Benefit of Data Cloud

At this point you have a functional demo of Data Cloud. You’ve reconciled data across different sources and schemas to identify lost sales opportunities. It’s time to build the business case.

When I was brought into customer conversations at Salesforce, I relied on a simple presentation flow:

  • Remind your stakeholders of the use case’s business objective.
  • Using one example customer, show the real, quantified scope of disconnected records before unification. Then highlight the insights into potential sales opportunities Data Cloud identified.
  • Use your Data Cloud PoC to show how end users will benefit from having unified insights at their fingertips.
  • Highlight the challenges you tackled, how long the PoC took to complete, and what you expect a broader rollout to require.
  • Ask for additional investment. Show with confidence you can make this real for your organization (or if you are a consultant, for your client). Show how your approach will scale.

Join us on Feb 13th for an insightful webinar: “Unlock the Full Potential of Salesforce’s Free Data Cloud Licenses.” Discover actionable tips on identifying impactful Data Cloud use cases, maximizing your free credits, and building a compelling business case for Data Cloud.

By Mehmet Orun

Mehmet has been on a quest to achieve Customer 360 with Salesforce for more than 18 years. As a Salesforce customer, he architected and oversaw the Master Data Management program for Salesforce’s first Life Science customer. During his 10+ year career at Salesforce, Mehmet served as the first Director of Data Quality for the Data.com product team, built and led Salesforce’s Data Strategy Practice, wrote requirement 0 for Salesforce Customer 360, and led the C360 Data Manager product until its integration with C360 Audiences (aka Marketing CDP), which we now know as Data Cloud. See full leadership profile.


Want to see our platform in action?

By leveraging the Cuneiform Platform, you can obtain and use more accurate, data-driven insights through effective data quality monitoring. Learn more about how we can help you with your important tasks.