I've been thinking a bit more about the code generation story. Do you think it might make sense to generate the code from the schema in build.jl and just save it to disk as .jl files? One benefit I could see is that the package would probably load a lot faster, because the code generation would only happen once at package install time, not at every package load time. It might also be easier to fully understand everything that gets generated? There is probably some downside, but I can't really think of any right now?
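A minimal sketch of what that install-time step could look like, assuming build.jl has access to the Vega-Lite JSON schema. The function name `write_generated_code`, the file names, and the stub each definition maps to are all illustrative, not the package's actual generator:

```julia
using JSON  # JSON.jl, already a dependency for parsing the schema

# Hypothetical install-time generator: read the schema once in deps/build.jl
# and emit plain Julia source that the package can simply `include` at load time.
function write_generated_code(schemafile::AbstractString, outfile::AbstractString)
    schema = JSON.parsefile(schemafile)
    open(outfile, "w") do io
        println(io, "# Auto-generated from ", basename(schemafile), " -- do not edit.")
        for name in sort!(collect(keys(schema["definitions"])))
            fname = lowercase(name)
            # One stub per schema definition; a real generator would map the
            # definition's properties onto typed keyword arguments.
            println(io, fname, "(; kwargs...) = Dict{String,Any}(string(k) => v for (k, v) in kwargs)")
        end
    end
end
```

The package source would then just `include` the emitted file, so no codegen machinery runs at `using` time.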
I recently noticed that the slow load time of VegaLite was partly due to `using FileIO`, which takes about half the time! The problem seems to be JuliaIO/FileIO.jl#156, itself caused by an inference regression in Julia. Another symptom is a large precompiled .ji file: FileIO.ji is 7.6 MB on my desktop.
If this problem gets solved, it will really reduce the pain of loading VegaLite.
If we decide to do something about slow load times anyway, something we could try (besides the codegen idea you pursued) is to build the rootSpec in build.jl, serialize it, and save it to a file. This might improve things: `using FileIO` accounts for roughly 50% of load time, and `include("schema_parsing.jl")` for another 25-30%.
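A rough sketch of that idea, assuming `rootSpec` is the object that schema_parsing.jl currently builds at load time; the helper names and file path are illustrative:

```julia
using Serialization  # stdlib since Julia 0.7; `serialize` lived in Base before that

# Run once, at the end of deps/build.jl, after the schema has been parsed:
save_rootspec(rootSpec, path) = open(io -> serialize(io, rootSpec), path, "w")

# Run at package load time, replacing the JSON parse + schema_parsing.jl pass:
load_rootspec(path) = open(deserialize, path)
```

One caveat: Julia's serialization format is not stable across Julia versions, so the file would need to be regenerated by build.jl after an upgrade, but `Pkg.build` re-runs build.jl anyway.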