Note that the query above has directional quotes “ / ”, but we don’t accept those as JSON string delimiters. However, I assume that’s a copy-paste issue. It’s probably not the only one, unfortunately, since that query succeeds on my end if I replace the directional quotes with " and '. Try uploading as preformatted text?
My guess is that the issue is with the name field. MemSQL doesn’t currently support the full Unicode range inside JSON strings (SingleStoreDB Cloud · SingleStore Documentation). Does the failing query succeed if you remove the “name” field from the input JSON?
If so, there’s a workaround: encode the unsupported characters as \u escape sequences. We’re also working on adding built-in support for the full range in an upcoming release.
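For illustration, here’s a minimal sketch of that \u-escape workaround using Python’s json module (the sample value is taken from the failing query; this is just one way to produce the escapes, not an official tool):

```python
import json

# Sample record whose "name" value contains characters outside the
# range MemSQL currently accepts in JSON strings.
record = {"name": "àØãàðà àðÿ", "id": "95444"}

# ensure_ascii=True (the default) replaces every non-ASCII character
# with a \uXXXX escape, which the JSON parser does accept.
escaped = json.dumps(record, ensure_ascii=True)
print(escaped)
```

The resulting string can then be inserted as the raw_json value directly.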
insert into test_json_insert (raw_json) values (
'{"ts":"2020-07-10T09:55:40Z","s_session_id":"","version":"1.0.10.1","session":"fdsdfs-c06f-4f62-833b-0e3b730bbeba","body":{"type":"game","name":"àØãàðà àðÿ ","id":"95444","session":"37f5ea28-720b-454c-a091-9bb772dc8814","room_id":"","number":1}}'
)
I’m sorry, here it is again. Thanks for your response. Your guess is right: the characters in the name field are causing the issue.
How can I escape those fields? I’m ingesting this data via pipelines.
@sasha I’m trying to apply a jq transformation to solve this issue, but it looks like the transformer does not work.
I’m adding the following to the pipeline definition, but the output is an empty row and the load fails.
From the command line, jq escapes the characters just fine. Am I missing something?
WITH TRANSFORM ('memsql://json','', ' -a "."')
Here’s my test from command line:
echo '{"test":"test","body":{"name":"raff obbyàíï"}}' | jq -a "."
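If passing jq arguments through WITH TRANSFORM keeps producing empty rows, one alternative is to point the pipeline at a small standalone transform script that does the escaping itself. A sketch in Python (this is an illustration of the idea, not official pipeline syntax; the script reads one JSON document per stdin line and re-emits it ASCII-escaped, which is the same effect as jq -a "."):

```python
#!/usr/bin/env python3
# Hypothetical pipeline transform: read raw JSON lines from stdin and
# re-emit them with all non-ASCII characters escaped as \uXXXX sequences.
import json
import sys

def escape_json_line(line: str) -> str:
    """Re-serialize a JSON line with non-ASCII escaped as \\uXXXX."""
    # ensure_ascii=True (the default) produces the \u escape sequences
    return json.dumps(json.loads(line), ensure_ascii=True)

if __name__ == "__main__":
    for raw in sys.stdin:
        if raw.strip():
            print(escape_json_line(raw))
```

This keeps the escaping logic in one place and avoids quoting issues when passing arguments to jq through the transform definition.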