Bulk inserting is not working #15
It's interesting. I will write a test for it. Thanks.
Is there anything I can provide to help you reproduce? I tried both the docker image and the binary on GNU/Linux (Alpine). I also tried environment variable configuration on the docker image. The result didn't change.
Did you send insert or select queries? If insert, to the same table or different ones?
Hey @nikepan, my code is like this:

```python
import requests

query = """
INSERT INTO table
(field1, field2)
VALUES
('{field1}', '{field2}');
""".format(
    field1=field1,
    field2=field2,
)

requests.post(
    url='http://clickhouse-bulk-url',
    params={
        'query': query,
    },
)
```

I have many workers on different servers/docker containers doing the same thing; my expectation was
Yes, I caught the bug. I don't know how it works. I will publish a new version soon. With it, all works fine and dumps are auto-resent. Thank you!
Thanks, looking forward to the new release! Do you have a date in mind?
It's because you send a multi-line query. For now, clickhouse-bulk supports only single-line queries. I will try to fix it in the next release.
Oh interesting. Would it work if I just replace
Yes, I checked it:
Awesome! I can go with that workaround until you release the new version then, thanks a lot @nikepan!
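The workaround discussed above can be sketched as follows. `to_single_line` is a hypothetical helper written for illustration, not part of clickhouse-bulk; it assumes the only requirement is that the whole query arrives on one line:

```python
def to_single_line(query: str) -> str:
    """Collapse a multi-line SQL query into one line, normalizing whitespace."""
    # str.split() splits on any run of whitespace, including newlines,
    # so joining with single spaces yields a one-line query.
    return " ".join(query.split())

query = """
INSERT INTO table
(field1, field2)
VALUES
('a', 'b')
"""
print(to_single_line(query))
# INSERT INTO table (field1, field2) VALUES ('a', 'b')
```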
I think I hit another possible bug. I converted my multi-line inserts to single-line ones, and even though I see this log,
I still see only a single row being inserted into the actual table.
I need an example of your request. Rows are counted by the \n character.
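If rows really are counted by newline characters (an assumption based on the comment above, not verified against clickhouse-bulk's source), then a single-line query would always count as one row, which matches the behavior reported. A tiny sketch of that counting rule, with `count_rows` as a hypothetical helper:

```python
def count_rows(batch_body: str) -> int:
    """Hypothetical mirror of counting rows by '\n' in a collected batch."""
    return batch_body.count("\n") + 1 if batch_body else 0

print(count_rows("('a', 1)"))            # 1 -- single line, one row
print(count_rows("('a', 1)\n('b', 2)"))  # 2 -- newline separates rows
```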
Here is the code:

```python
import requests

query = "INSERT INTO data (id, date_time, item_id, is_successful, detail, time1, time2, time3, time4, time5, time6, time7) VALUES ('bf58a4af-0e56-4e07-91d9-bbf70917e0da', '2019-10-15 21:36:43', 10, 1, 'OK', 8, 49, 99, 99, 222, 0, 377);"

requests.post(
    url='http://clickhouse-bulk-url',
    params={
        'query': query,
    },
)
```
A testable example:

```python
import requests

query = "INSERT INTO data (id, date_time, item_id, is_successful, detail, time1, time2, time3, time4, time5, time6, time7) VALUES ('bf58a4af-0e56-4e07-91d9-bbf70917e0da', '2019-10-15 21:36:43', 10, 1, 'OK', 8, 49, 99, 99, 222, 0, 377);"

for i in range(10):
    requests.post(
        url='http://clickhouse-bulk-url',
        params={
            'query': query,
        },
    )
```

Here is the log:

And I see 1 row inserted into the database.
You need to remove the ";" at the end of the line. It's a bug too :) Thanks for the report.
For speed, clickhouse-bulk parses strings simply and does not support many variations :)
No problem :) I will try without the semicolon and check again. Thanks.
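The semicolon workaround above can be applied client-side before sending. `strip_trailing_semicolon` is a hypothetical helper written for illustration; it assumes the parser only trips on a trailing ";", as the maintainer described:

```python
def strip_trailing_semicolon(query: str) -> str:
    """Drop a trailing semicolon (and surrounding whitespace) from a query."""
    # Strip whitespace first so "...;  " is also handled.
    return query.rstrip().rstrip(";").rstrip()

print(strip_trailing_semicolon("INSERT INTO t VALUES (1);"))
# INSERT INTO t VALUES (1)
```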
@nikepan, hello. It seems like we've faced this issue too.
No, it is not fixed. I found some troubles with these changes. I will try to fix it later, but not now.
Ok, thanks, at least now we know the status exactly :)
Hello
I'm getting
Thanks!
Ok, nailed it!
Hi, first of all, thank you for making clickhouse-bulk 💐
I am running with this config.
Shouldn't this config collect incoming requests and insert them in bulk every 3 seconds? I am watching the logs and seeing every insert I send via HTTP (I am inserting with Python's `requests`) being processed immediately as I send it. What am I doing wrong?