Memory consumption - something is using all the available memory

I have a problem with 7.3.12. I moved all the rowstore tables to columnstore (that was not the plan, but I don't know what else to try), and everything is still randomly failing due to memory usage, even a simple CREATE TABLE command.

I run this query and it shows I'm using only 7 GB of memory (my cluster has 124 GB):

select DATABASE_NAME, table_name, sum(memory_use)/(1024*1024*1024) m
from information_schema.table_statistics
group by table_name,DATABASE_NAME
order by 3 desc

I'm not sure what's causing the memory consumption. The only thing that's different is that we created 50 databases with rowstore tables, but all of them are empty… Does that allocate memory as well?
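For context, the dump below is what show status extended returns on one of the nodes (these counters are per node, as far as I can tell):

-- run on the node being investigated; counters are reported per node
SHOW STATUS EXTENDED;
-- LIKE filtering should also work if you only want the memory-related keys
SHOW STATUS EXTENDED LIKE '%memory%';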

Aborted_clients 290411
Aborted_connects 119820
Active_dedicated_admin_connections 0
Aggregator_id 1
Alloc_analyze 326.750 MB
Alloc_client_connection 3.625 MB
Alloc_compiled_unit_sections 23.383 (+4.602) MB
Alloc_databases_list_entry 1.625 MB
Alloc_deleted_version 164.250 MB
Alloc_global_func 0.125 MB
Alloc_hash_buckets 329.311 MB
Alloc_internal_key_node 66.000 MB
Alloc_plan_cache 0.625 MB
Alloc_protocol_packet 47.375 (-1.625) MB
Alloc_replication 1.625 MB
Alloc_sharding_partitions 0.625 MB
Alloc_skiplist_tower 384.500 (+0.875) MB
Alloc_table_autostats 21.173 MB
Alloc_table_memory 1672.233 (+1.750) MB
Alloc_table_metadata_cache 1.500 MB
Alloc_table_primary 426.500 (+0.375) MB
Alloc_thread_stacks 194.000 MB
Alloc_unit_ifn_thunks 1.711 (+0.204) MB
Alloc_unit_images 4.846 (+0.916) MB
Alloc_variable 280.500 (+0.500) MB
Alloc_variable_allocated 265.6 MB
Alloc_variable_bucket_104 allocs:880 alloc_MB:0.1 buffer_MB:0.1 cached_buffer_MB:0.0
Alloc_variable_bucket_1168 allocs:52 alloc_MB:0.1 buffer_MB:0.1 cached_buffer_MB:0.0
Alloc_variable_bucket_11896 allocs:21 alloc_MB:0.2 buffer_MB:0.2 cached_buffer_MB:0.0
Alloc_variable_bucket_128 allocs:1779588 alloc_MB:217.2 buffer_MB:224.5 cached_buffer_MB:0.0
Alloc_variable_bucket_130960 allocs:0 alloc_MB:0.0 buffer_MB:1.9 cached_buffer_MB:1.9
Alloc_variable_bucket_14544 allocs:4 alloc_MB:0.1 buffer_MB:0.4 cached_buffer_MB:0.2
Alloc_variable_bucket_1480 allocs:91 alloc_MB:0.1 buffer_MB:0.2 cached_buffer_MB:0.0
Alloc_variable_bucket_16 allocs:41999 alloc_MB:0.6 buffer_MB:0.8 cached_buffer_MB:0.0
Alloc_variable_bucket_160 allocs:1087 alloc_MB:0.2 buffer_MB:0.2 cached_buffer_MB:0.0
Alloc_variable_bucket_1832 allocs:21 alloc_MB:0.0 buffer_MB:0.1 cached_buffer_MB:0.0
Alloc_variable_bucket_18696 allocs:15 alloc_MB:0.3 buffer_MB:2.1 cached_buffer_MB:1.8
Alloc_variable_bucket_200 allocs:42 alloc_MB:0.0 buffer_MB:0.1 cached_buffer_MB:0.0
Alloc_variable_bucket_21816 allocs:0 alloc_MB:0.0 buffer_MB:0.1 cached_buffer_MB:0.1
Alloc_variable_bucket_2288 allocs:21 alloc_MB:0.0 buffer_MB:0.1 cached_buffer_MB:0.0
Alloc_variable_bucket_24 allocs:1809226 alloc_MB:41.4 buffer_MB:41.9 cached_buffer_MB:0.0
Alloc_variable_bucket_248 allocs:118 alloc_MB:0.0 buffer_MB:0.1 cached_buffer_MB:0.0
Alloc_variable_bucket_26184 allocs:0 alloc_MB:0.0 buffer_MB:0.1 cached_buffer_MB:0.1
Alloc_variable_bucket_2832 allocs:49 alloc_MB:0.1 buffer_MB:0.2 cached_buffer_MB:0.0
Alloc_variable_bucket_312 allocs:1969 alloc_MB:0.6 buffer_MB:0.6 cached_buffer_MB:0.0
Alloc_variable_bucket_32 allocs:17209 alloc_MB:0.5 buffer_MB:0.6 cached_buffer_MB:0.0
Alloc_variable_bucket_32728 allocs:0 alloc_MB:0.0 buffer_MB:0.1 cached_buffer_MB:0.1
Alloc_variable_bucket_3528 allocs:11 alloc_MB:0.0 buffer_MB:0.1 cached_buffer_MB:0.0
Alloc_variable_bucket_384 allocs:55 alloc_MB:0.0 buffer_MB:0.1 cached_buffer_MB:0.0
Alloc_variable_bucket_40 allocs:1902 alloc_MB:0.1 buffer_MB:0.1 cached_buffer_MB:0.0
Alloc_variable_bucket_43648 allocs:0 alloc_MB:0.0 buffer_MB:0.1 cached_buffer_MB:0.1
Alloc_variable_bucket_4504 allocs:12 alloc_MB:0.1 buffer_MB:0.1 cached_buffer_MB:0.0
Alloc_variable_bucket_48 allocs:676 alloc_MB:0.0 buffer_MB:0.1 cached_buffer_MB:0.0
Alloc_variable_bucket_480 allocs:22 alloc_MB:0.0 buffer_MB:0.1 cached_buffer_MB:0.0
Alloc_variable_bucket_56 allocs:1562 alloc_MB:0.1 buffer_MB:0.1 cached_buffer_MB:0.0
Alloc_variable_bucket_5680 allocs:30 alloc_MB:0.2 buffer_MB:0.2 cached_buffer_MB:0.0
Alloc_variable_bucket_600 allocs:237 alloc_MB:0.1 buffer_MB:0.2 cached_buffer_MB:0.0
Alloc_variable_bucket_6224 allocs:42 alloc_MB:0.2 buffer_MB:0.4 cached_buffer_MB:0.1
Alloc_variable_bucket_64 allocs:1622 alloc_MB:0.1 buffer_MB:0.1 cached_buffer_MB:0.0
Alloc_variable_bucket_65472 allocs:0 alloc_MB:0.0 buffer_MB:0.1 cached_buffer_MB:0.1
Alloc_variable_bucket_72 allocs:3601 alloc_MB:0.2 buffer_MB:0.4 cached_buffer_MB:0.0
Alloc_variable_bucket_7264 allocs:69 alloc_MB:0.5 buffer_MB:0.5 cached_buffer_MB:0.0
Alloc_variable_bucket_752 allocs:42 alloc_MB:0.0 buffer_MB:0.1 cached_buffer_MB:0.0
Alloc_variable_bucket_80 allocs:1516 alloc_MB:0.1 buffer_MB:0.1 cached_buffer_MB:0.0
Alloc_variable_bucket_88 allocs:563 alloc_MB:0.0 buffer_MB:0.1 cached_buffer_MB:0.0
Alloc_variable_bucket_9344 allocs:230 alloc_MB:2.0 buffer_MB:2.1 cached_buffer_MB:0.0
Alloc_variable_bucket_936 allocs:60 alloc_MB:0.1 buffer_MB:0.1 cached_buffer_MB:0.0
Alloc_variable_cached_buffers 4.6 MB
Alloc_warnings 4.875 MB
Auto_attach_remaining_seconds 0
Average_garbage_collection_duration 34 ms
Buffer_manager_cached_memory 2046.2 (+6.5) MB
Buffer_manager_memory 3762.9 (+6.6) MB
Buffer_manager_unrecycled_memory 6.1 MB
Bytes_received 5455422937201
Bytes_sent 3489911125073
Columnstore_ingest_management_active_queries 0
Columnstore_ingest_management_estimated_memory 0.000 MB
Columnstore_ingest_management_estimated_segments_to_flush 0
Columnstore_ingest_management_max_concurrency 0
Columnstore_ingest_management_queued_queries 0
Connections 978906
Context_switch_misses 4307537
Context_switches 5098251
Disk_space_reserved_for_secondary_index 0
Execution_time_of_reads 1065018166 ms
Execution_time_of_write 1828144188 ms
Failed_read_queries 4409
Failed_write_queries 38434
Free_io_pool_memory 0.2 MB
GCed_versions_last_sweep 0
Idle_queue 0
Inflight_async_compilations 0
Ingest_errors_disk_space_use 2762284 Bytes
License_capacity 4 units
License_expiration 0
License_key d44f474d16c842b3ab38554cb5dc37a5
License_type free
License_version 4
Linux_resident_memory 6565.442 (+104.227) MB
Linux_resident_shared_memory 14.481 (-4.636) MB
Malloc_active_memory 639.214 (-0.140) MB
Malloc_transaction_cached_memory 203.891 MB
Max_used_connections 193
Maximum_cluster_capacity 4 units
Prepared_stmt_count 15
Queries 915800993
Query_compilation_failures 0
Query_compilations 3138
Questions 915800993
Ready_queue 0
Row_lock_wait_time 0 ms
Rows_affected_by_writes 13177963267
Rows_returned_by_reads 19002654302
Seconds_until_expiration -1
Ssl_accept_renegotiates 0
Ssl_accepts 0
Ssl_callback_cache_hits 0
Ssl_cipher
Ssl_cipher_list
Ssl_client_connects 0
Ssl_connect_renegotiates 0
Ssl_ctx_verify_depth 18446744073709551615
Ssl_ctx_verify_mode 0
Ssl_default_timeout 0
Ssl_finished_accepts 0
Ssl_finished_connects 0
Ssl_session_cache_hits 0
Ssl_session_cache_misses 0
Ssl_session_cache_mode SERVER
Ssl_session_cache_overflows 0
Ssl_session_cache_size 20480
Ssl_session_cache_timeouts 0
Ssl_sessions_reused 0
Ssl_used_session_cache_entries 0
Ssl_verify_depth 0
Ssl_verify_mode 0
Ssl_version
Successful_read_queries 290015106
Successful_write_queries 622224525
Threads_background 1
Threads_cached 187
Threads_connected 380
Threads_created 193
Threads_idle 374
Threads_running 1
Threads_shutdown 0
Threads_waiting_for_disk_space 0
Total_blobs_processed_for_fsync 0
Total_blobs_submitted_for_fsync 0
Total_dedicated_admin_connections 0
Total_io_pool_memory 13.4 MB
Total_server_memory 4989.9 (-0.1) MB
Transaction_buffer_wait_time 0 ms
Transaction_log_flush_wait_time 656 ms
Used_cluster_capacity 4 units
Used_instance_license_units 0
Workload_management_active_connections 0
Workload_management_active_queries 0
Workload_management_active_threads 0
Workload_management_queued_queries 0

Memory used by MemSQL (20280.25 Mb) has reached the ‘maximum_memory’ setting (20480 Mb) on this node. Possible causes include (1) available query execution memory has been used up for table memory (in use table memory: 12998.38 Mb) and (2) the query is large and complex and requires more query executi
Callstack:

Sometimes the error instead reports in use table memory: 0 Mb.
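If I read the error correctly, the limit it refers to is the maximum_memory engine variable (and, I assume, maximum_table_memory for the table-memory part), which can be checked per node with:

-- values are in MB and apply per node
SHOW VARIABLES LIKE 'maximum_memory';
SHOW VARIABLES LIKE 'maximum_table_memory';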


I don’t have specific advice to offer, but here’s good information in our documentation regarding how to deal with out-of-memory conditions.

Hi @hanson, thank you. I already read all the documentation, but something is not OK with this DB.

For instance, I’m getting this error:
available query execution memory has been used up for table memory (in use table memory: 17386.75 Mb)

But in show status extended it shows: Alloc_table_memory 2885.353 MB

I'm querying that during the execution of the query, and it doesn't change. Something is consuming table memory, and it looks like SingleStore doesn't know about it.
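Since these counters seem to be per node, I also compare Alloc_table_memory across the nodes instead of only on the aggregator. A rough way to do that from one place (assuming the mv_global_status view is available in this version) is:

-- one row per node; variable names match what SHOW STATUS EXTENDED reports
select node_id, variable_value
from information_schema.mv_global_status
where variable_name = 'Alloc_table_memory'
order by node_id;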

Hi @moralez.rodrigo, @hanson,

I’m running into a similar issue here, and still waiting for a comprehensive answer.

Thanks

It's crazy. I have 4x32 GB servers, only one query running, not that complicated, and I'm still getting out-of-memory errors. I moved every table to columnstore to save memory, and it's still the same.

I have many (well, not that many) empty rowstore tables and empty databases; do they reserve memory?

@hanson any idea? We store JSON fields in a columnstore table; does that consume memory?

Hi. Have you tried opening a ticket with support? They always ask for a cluster report and, based on that, can tell you right away what resources are being used and how.

There is something definitely wrong with SingleStore. I don't think their database is optimized enough. We have an S00, and it is not able to execute a single insert query of about 4 MB, with no other load running on the machine and 1 GB of memory still available.

My worry is that even if we plan to upgrade, it looks like people are facing the same thing with 4x32 GB servers.

This is something the SingleStore team needs to improve on.

@moralez.rodrigo Did you find anything? How did it get resolved for you?

Hi @bhaskar, if you can give more details about how to reproduce the problem, maybe somebody can offer help. If you’re a paying customer, you can also open a support ticket if you need to.

Thanks @hanson, but this is what helped me:

Go to the Processes tab and sort by Memory to see which applications are consuming the most memory.

Hello @hanson! I upgraded to version 8.5.9, but I'm still facing memory issues I cannot identify.
According to Studio, my cluster is using almost all of the memory all the time.

But the memory usage of the tables is not that big.
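(Roughly the same per-table view can be pulled with the query from earlier in the thread:)

select database_name, table_name, sum(memory_use)/(1024*1024*1024) gb
from information_schema.table_statistics
group by database_name, table_name
order by 3 desc;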

We don't have HA enabled, so that's not the reason.

I was checking show status extended and I see these keys:

Total_server_memory 7045.5 (-0.1) MB
Linux_resident_memory 7569.848 (+61.836) MB
Buffer_manager_memory 6608.8 MB
Alloc_variable 2242.750 (+1.375) MB
Alloc_variable_allocated 2170.9 MB
Alloc_table_primary 2456.375 MB
Alloc_table_memory 6177.861 MB

Is there something that can be done to release memory?