[Bug] hugeclient batch write fails with: java.lang.IllegalArgumentException: the max length of bytes is 65535, but get 337492. #2291
Comments
@DanGuge could you take a look at it? Better to make the vertex/edge ID length configurable
I will check this later
Thanks for your reply, I resolved this problem last night [grin]. The reason writing records to the hugegraph server failed is that the length of a Text (String) property value is limited to 65535 bytes. After checking my data rows, I found that a few rows had a property value longer than 65535; writing succeeded after limiting the property length. Below is the full logic to check the property value length:
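The screenshots originally attached here did not survive. A minimal sketch of such a pre-write check, assuming the 65535-byte limit from the error message (the helper names are hypothetical, not the author's actual code), might look like:

```java
import java.nio.charset.StandardCharsets;

public class PropertyLengthCheck {
    // The HugeGraph server in this thread rejects Text property values
    // whose serialized byte length exceeds 65535.
    static final int MAX_TEXT_BYTES = 65535;

    // Truncate a string so that its UTF-8 encoding fits within maxBytes.
    static String truncateUtf8(String value, int maxBytes) {
        byte[] bytes = value.getBytes(StandardCharsets.UTF_8);
        if (bytes.length <= maxBytes) {
            return value;
        }
        // Walk back from the cut point so we never split a multi-byte
        // UTF-8 sequence (continuation bytes have the form 10xxxxxx).
        int cut = maxBytes;
        while (cut > 0 && (bytes[cut] & 0xC0) == 0x80) {
            cut--;
        }
        return new String(bytes, 0, cut, StandardCharsets.UTF_8);
    }
}
```

Applying `truncateUtf8(value, MAX_TEXT_BYTES)` to each Text property before building the batch keeps every row under the server-side limit.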
Closing my issue; the length of a Text property value should be less than 65535.
We know the limitation, and are considering adding an option for users to modify it (likewise for the length of vertex/edge IDs)
Excuse me, is there a PR that solves this issue?
Thanks for the reminder, we'll address it again
I want to implement this function; can it be assigned to me?
@LiJie20190102 If you want to make the property length configurable, we could reopen this issue & link a PR to it
@LiJie20190102 I haven't created a PR in the repository to fix this problem; "extract the limitation to a configuration" is the suggestion from the database developers.
I want to do both |
I think a good solution is not to hard-limit the length of vertex/edge IDs or properties, but to make the limit constants of org.apache.hugegraph.backend.serializer.BytesBuffer, such as UINT8_MAX, dynamically changeable, because in business use the length of vertex IDs sometimes really does need to be that long. In that case we should not limit their length, but instead adjust the limit to accept the corresponding value. WDYT @imbajin @dongma
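For context, the check being discussed is a hard comparison against a compile-time constant; the proposal above would turn that constant into a value supplied at startup. A rough sketch of the configurable variant, under the assumption that the limit is injected from configuration (the class and option names here are hypothetical, not the actual hugegraph code):

```java
public class SerializedLengthLimit {
    // In org.apache.hugegraph.backend.serializer.BytesBuffer the limit
    // is a compile-time constant; here it is a field that a hypothetical
    // configuration option could set once at server startup.
    private final int maxStringBytes;

    public SerializedLengthLimit(int maxStringBytes) {
        this.maxStringBytes = maxStringBytes;
    }

    // Mirrors the failing check from the issue title: reject values
    // whose serialized form exceeds the configured limit.
    public void checkBytesLength(byte[] bytes) {
        if (bytes.length > this.maxStringBytes) {
            throw new IllegalArgumentException(String.format(
                "The max length of bytes is %d, but got %d",
                this.maxStringBytes, bytes.length));
        }
    }
}
```

Note the compatibility caveat raised later in this thread: data serialized under a larger limit may not be readable by an instance configured with a smaller one, so such an option would need to be fixed per deployment.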
@dosu-bot What's your suggestion/analysis? |
🤖 Hey @imbajin, great to see you back with another insightful query! Hope you're doing well.

Thank you for your detailed explanation and suggestion. However, allowing users to dynamically change the limit constants of `org.apache.hugegraph.backend.serializer.BytesBuffer` is risky. These constants are relied upon throughout the serialization logic, and changing them dynamically could lead to unpredictable behavior, data corruption, or even system crashes. For example, if you increase a limit, data serialized under the larger value may no longer be interpreted consistently by instances still using the default.

Moreover, these constants are also used to allocate memory for buffers. Dynamically changing these values could lead to inefficient memory usage or even out-of-memory errors if the values are set too high.

Therefore, it's not recommended to allow users to customize these configurations dynamically. If you need to handle larger IDs or attributes, it would be better to consider changing the data model or using a different data type that can accommodate larger values. If you still want to proceed with this, it would require a significant amount of testing to ensure the system behaves correctly under different configurations. It's also important to provide clear documentation and warnings to users about the potential risks of changing these values.

I hope this helps. If you have any other questions or need further clarification, feel free to ask.

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
…ncode/decode string id length (#2622)

As title, change limits:
- vid max to 16KB
- eid max to 64KB (128KB as backup)
- property max to 10MB (keep consistent)

fix #1593 #2291

Co-authored-by: imbajin <[email protected]>
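With that change the limits differ per field. A client-side pre-check against them, using the values from the commit message above (the class and method names are hypothetical, not part of the hugegraph-client API), could be sketched as:

```java
import java.nio.charset.StandardCharsets;

public class HugeGraphLimits {
    // Limits as stated in the commit message for PR #2622.
    static final int MAX_VID_BYTES = 16 * 1024;             // 16KB
    static final int MAX_EID_BYTES = 64 * 1024;             // 64KB
    static final int MAX_PROPERTY_BYTES = 10 * 1024 * 1024; // 10MB

    // Returns true when the UTF-8 encoding of value fits within maxBytes.
    static boolean fits(String value, int maxBytes) {
        return value.getBytes(StandardCharsets.UTF_8).length <= maxBytes;
    }
}
```

Rejecting oversized IDs and property values on the client side gives a clearer error than waiting for the server's IllegalArgumentException.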
Bug Type (问题类型)
rest-api (result not as expected)
Before submit
Environment (环境信息)
Graph version: 1.0.0
Storage engine: HBase
Data volume: 100M+ vertices, 100M+ edges. Vertices were written successfully; edges carry many property fields and are written in batches of about 1000 records.
Expectation: for request bodies larger than 95535, is there a server-side parameter that can be adjusted (which parameter limits the request body size)?
Expected & Actual behavior (期望与实际表现)
When hugeclient writes data in batches, a single request body exceeds the server's default 65535 limit, so the edge data cannot be written.
The exception stack trace is as follows:
Vertex/Edge example (问题点 / 边数据举例)
No response
Schema [VertexLabel, EdgeLabel, IndexLabel] (元数据结构)
No response