What the SingleStore Data Intensity Calculator Tells Us About Enterprise Application Infrastructure Requirements — Today and In the Future

6 min read

Sep 27, 2022

SingleStore is sharing results from its first-of-its-kind Data Intensity Calculator, which 125 companies have already used. Find out what their results reveal in this blog.

Only a few months have passed since we introduced the groundbreaking Data Intensity Calculator. Already, 125 companies have used this online tool to measure their data intensity.

Half of the applications that these organizations tested are considered highly data intensive. And nearly 65% of these companies — which span industries including finance, retail and tech — said they expect their data will grow between 10% and 100% within the next year.

But before we get into the nitty gritty, let’s revisit what data intensity means, how we measure it and why understanding and addressing data intensity are so critical.

Assessing Data Intensity Helps You Better Understand Your Infrastructure Requirements

Data intensity measures the data requirements of an application. It's important to get a handle on the data intensity of your applications; only then will you know what infrastructure you need to deliver the right end-user experiences, both now and going forward.

We make gauging the data intensity of your applications easy. The calculator we launched in May allows you to assess your applications’ data intensity for free — and in just three minutes.

The SingleStore Data Intensity Index measures data intensity based on five considerations (a toy scoring sketch follows this list):

  • Concurrency: the application's requirement to support a large number of users or concurrent queries without sacrificing the SLA on query latency
  • Data size: the volume of the data sets needed to feed the application
  • Query complexity: the extent to which the application must handle simple and complex queries
  • Data ingest speed: the application's need to ingest high volumes of data at speed
  • Query latency: the amount of time it takes to execute a query and get results
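
To make the five factors concrete, here is a minimal, hypothetical scoring sketch in Python. The bucket thresholds loosely mirror the ranges reported later in this post, but the equal weighting and the score_intensity function are illustrative assumptions, not the actual methodology behind the SingleStore Data Intensity Index.

```python
# Hypothetical data-intensity scorer. The factor buckets and equal
# weighting are illustrative assumptions, not the actual methodology
# behind the SingleStore Data Intensity Index.

def bucket(value, thresholds):
    """Map a raw value to a 1-4 score using ascending thresholds."""
    return 1 + sum(value >= t for t in thresholds)

def score_intensity(concurrent_queries, data_size_tb, joins,
                    ingest_rows_per_sec, latency_ms):
    scores = {
        "concurrency":  bucket(concurrent_queries, [5, 100, 1_000]),
        "data_size":    bucket(data_size_tb, [1, 10, 100]),
        "complexity":   bucket(joins, [3, 6, 10]),
        "ingest_speed": bucket(ingest_rows_per_sec, [1_000, 100_000, 1_000_000]),
        # Tighter latency targets mean higher intensity, so invert the scale.
        "latency":      5 - bucket(latency_ms, [10, 100, 1_000]),
    }
    return scores, sum(scores.values())  # total ranges from 5 (low) to 20 (high)

# Example: 200 concurrent queries over 15 TB with 4-way joins,
# 50,000 rows/sec ingest and a 100 ms latency target.
per_factor, total = score_intensity(200, 15, 4, 50_000, 100)
print(per_factor, total)
```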

Now that we’ve run through this data intensity refresher, let’s take a closer look at the findings.

Applications Are Showing Significant Complexity and Concurrency, Growing Data Requirements

The vast majority of applications assessed using the Data Intensity Calculator registered a high or medium level of complexity. Of the 124 applications tested, 106 fell into these categories. Thirty-four of those applications are considered highly complex, requiring six or more joins, while 72 required three to five joins. The remaining 18 applications needed just one or two joins.
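
To make the join buckets concrete, here is an illustrative analytical query with three joins, issued from Python over a MySQL-wire connection (SingleStore speaks the MySQL wire protocol). The tables, columns and connection details are invented for the example.

```python
import pymysql  # SingleStore is MySQL wire-protocol compatible

# Illustrative 3-join analytical query; all names are invented.
SQL = """
SELECT c.region, p.category,
       SUM(o.amount) AS revenue,
       COUNT(*)      AS orders
FROM orders o
JOIN customers c ON c.id = o.customer_id
JOIN products  p ON p.id = o.product_id
JOIN dates     d ON d.id = o.date_id
WHERE d.year = 2022
GROUP BY c.region, p.category
"""

conn = pymysql.connect(host="localhost", port=3306, user="app",
                       password="...", database="demo")
with conn.cursor() as cur:
    cur.execute(SQL)
    for row in cur.fetchall():
        print(row)
conn.close()
```

With three joins, a query like this sits in the medium bucket; a highly complex application would routinely chain six or more.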

Concurrency requirements also came in on the high end. Just six of the 124 Data Intensity Calculator entries reported that their database typically handles fewer than five concurrent queries, while more than twice as many (14) reported more than 1,000. Additionally (a minimal load-test sketch follows these figures):

  • 19 had five to 10 concurrent queries
  • 61 had 10 to 100 concurrent queries
  • 24 had 100 to 1,000 concurrent queries
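
Here is a minimal sketch of what "handling N concurrent queries without breaking a latency SLA" means in practice. The query, connection details and 100 ms SLA are placeholder assumptions, not values from the calculator.

```python
import time
import statistics
from concurrent.futures import ThreadPoolExecutor

import pymysql

CONCURRENCY = 100  # simulated concurrent queries
SLA_MS = 100       # placeholder latency target

def timed_query(_):
    # One connection per task; a real harness would reuse a pool.
    conn = pymysql.connect(host="localhost", user="app",
                           password="...", database="demo")
    start = time.perf_counter()
    with conn.cursor() as cur:
        cur.execute("SELECT COUNT(*) FROM orders")  # placeholder query
        cur.fetchall()
    conn.close()
    return (time.perf_counter() - start) * 1_000  # milliseconds

with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    latencies = sorted(pool.map(timed_query, range(CONCURRENCY)))

p95 = latencies[int(0.95 * len(latencies)) - 1]
print(f"median={statistics.median(latencies):.1f} ms  p95={p95:.1f} ms  "
      f"SLA {'met' if p95 <= SLA_MS else 'missed'}")
```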

Data size was another category in which the Data Intensity Calculator registered some of the highest application demands. Forty-one of the applications tested had a data size of 1 to 10 terabytes (TB). About one-fifth (23 of 124 applications) employed 10 to 50 TB of data. Nine of the applications currently rely on 50 to 100 TB of data to get the job done. And a dozen of the applications needed more than 100 TB of data to deliver on their intended experiences.

But that is just a fraction of the data that organizations expect their applications to need in the near future. Over the next 12 months:

  • 40 companies said their data will grow 10-30%
  • 30 organizations expect their data to increase 30-60%
  • 11 organizations anticipate whopping growth of 60-100%
  • 10 companies say their data growth is poised to exceed 100%

Businesses Must Also Support Their Applications' Data Ingest Speeds and Latency Control Needs

When it comes to data ingest (a simple throughput measurement is sketched after these figures):

  • 28 of the 124 applications required ingest rates of less than 1,000 rows per second
  • 40 needed 1,000 to 10,000 rows per second
  • 33 demanded 10,000 to 100,000 rows per second
  • About half as many (16) needed 100,000 to 1 million rows per second
  • Seven required more than 1 million rows per second
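
To put these figures in perspective, a rows-per-second measurement can be sketched with batched multi-row inserts. The table, batch size and connection details below are placeholder assumptions, and sustained production ingest would more likely go through bulk-load paths such as SingleStore Pipelines.

```python
import time

import pymysql

BATCH = 10_000
TOTAL = 100_000  # rows to insert for the measurement

conn = pymysql.connect(host="localhost", user="app",
                       password="...", database="demo")
rows = [(i, f"event-{i}") for i in range(TOTAL)]

start = time.perf_counter()
with conn.cursor() as cur:
    for i in range(0, TOTAL, BATCH):
        # Multi-row INSERT batches; one statement per 10,000 rows.
        cur.executemany(
            "INSERT INTO events (id, payload) VALUES (%s, %s)",
            rows[i:i + BATCH])
conn.commit()

elapsed = time.perf_counter() - start
print(f"{TOTAL / elapsed:,.0f} rows/sec")
```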

In terms of latency, the largest group of applications (44) needed to keep delays between 100 milliseconds and 1 second. Thirty-two applications had more stringent latency requirements, between 10 and 100 milliseconds. Eight applications needed latency of 10 milliseconds or less.

Controlling latency of data-intensive applications is key to delivering the real-time experiences that customers now expect. But, as IDC notes, legacy systems that rely on batch-based processing don’t sync in real time — these processes are more like sending a letter than a text.

Data delays put businesses at risk by adversely impacting customer experience, leading to customer annoyance, churn and lost revenue, the global market intelligence firm says. IDC explains that real time is imperative for enterprise intelligence and a better customer experience: business happens in real time; the best kind of enterprise intelligence allows organizations to make decisions based on the most current data; and streaming data use cases exist across all industries, from manufacturing and finance to retail and healthcare.

Data-Intensive Applications Call for a New Kind of Data Infrastructure

The bottom line is that a fair share of today’s applications are already data intensive, and all signs suggest that the world will see a whole lot more of these applications in the future.  

IDC expects new data that is created, captured, replicated and consumed to more than double between now and 2026. McKinsey & Co. notes that “tech customers’ needs and expectations are rapidly evolving” in the wake of the pandemic-fueled digital transformation. And data-intensive applications have become the lifeblood of today’s most competitive businesses.

Growing data intensity and user expectations call for a new approach to data infrastructure: one that brings transactional and analytical capabilities into a single database that can handle rapidly growing volumes of data, support high concurrency, ingest data fast, serve both simple and complex queries, and return results with minimal latency.
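
As a loose illustration of that single-database pattern, the sketch below interleaves a transactional write with an analytical aggregate over one MySQL-wire connection; the tables and connection details are invented for the example.

```python
import pymysql

conn = pymysql.connect(host="localhost", user="app",
                       password="...", database="demo", autocommit=True)

with conn.cursor() as cur:
    # Transactional path: record a new order as it happens.
    cur.execute("INSERT INTO orders (customer_id, amount) VALUES (%s, %s)",
                (42, 19.99))

    # Analytical path: aggregate over the same, freshly written data,
    # with no batch ETL hop in between.
    cur.execute("""
        SELECT customer_id, SUM(amount) AS lifetime_value
        FROM orders
        GROUP BY customer_id
        ORDER BY lifetime_value DESC
        LIMIT 10
    """)
    print(cur.fetchall())
```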

Legacy and specialty databases aren't up to the task. SingleStoreDB is the one database that can do it all.

Want to learn how data intensive your own applications are? Try out our free Data Intensity Calculator.

Want to learn more about what infrastructure you need to deliver on the promise of better end-user experiences and enterprise intelligence? Schedule your time to chat with SingleStore engineers today. 

