
refactor: fixes for futures #3363

Merged: 2 commits from refactor-fixes_for_futures into denoland:master on Nov 17, 2019

Conversation

@bartlomieju (Member) commented Nov 17, 2019:

This PR is a follow-up to #3358. After landing that PR, the benchmarks exploded, indicating problems with workers and deno_core_http_bench.

The bench is already fixed and the workers are partially fixed.

EDIT: This PR dramatically reduces the churn in thread/syscall counts that showed up on the benchmarks. The thread count is not back to previous levels, but the difference went from hundreds/thousands down to about 50.

@bartlomieju (Member, Author) commented:

deno/cli/ops/workers.rs, lines 161 to 173 at 2b3afda:

  // Has provided source code, execute immediately.
  if has_source_code {
    js_check(worker.execute(&source_code));
    return Ok(JsonOp::Sync(exec_cb(worker)));
  }

  let op = worker
    .execute_mod_async(&module_specifier, None, false)
    .and_then(move |()| futures::future::ok(exec_cb(worker)));

  let result = tokio_util::block_on(op.boxed())?;
  Ok(JsonOp::Sync(result))
}

It should use futures::executor::block_on instead of tokio_util::block_on, but when I make that change I get the error shown after the sketch below:
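
A rough sketch of the swap, applied to the tail of the snippet above (an illustration of the idea, not the exact diff from this PR):

  // Hypothetical variant of the last two lines of the snippet above: drive the
  // future with the plain futures 0.3 executor instead of the tokio_util
  // helper. The op is dispatched on a tokio runtime worker thread, and the
  // enter() guard inside futures-executor reports that the thread is already
  // running an executor, so block_on panics (see the backtrace below).
  let result = futures::executor::block_on(op)?;
  Ok(JsonOp::Sync(result))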

thread 'tokio-runtime-worker-0' panicked at 'cannot execute `LocalPool` executor from within another executor: EnterError', src/libcore/result.rs:1165:5
stack backtrace:
   0: <std::sys_common::backtrace::_print::DisplayBacktrace as core::fmt::Display>::fmt
   1: core::fmt::write
   2: std::io::Write::write_fmt
   3: std::panicking::default_hook::{{closure}}
   4: std::panicking::default_hook
   5: std::panicking::rust_panic_with_hook
   6: std::panicking::continue_panic_fmt
   7: rust_begin_unwind
   8: core::panicking::panic_fmt
   9: core::result::unwrap_failed
  10: core::result::Result<T,E>::expect
  11: futures_executor::local_pool::run_executor
  12: futures_executor::local_pool::block_on
  13: deno_cli::ops::workers::op_worker_post_message
  14: core::ops::function::Fn::call
  15: deno_cli::state::ThreadSafeState::stateful_op::{{closure}}
  16: deno_cli::ops::dispatch_json::json_op::{{closure}}::{{closure}}
  17: core::result::Result<T,E>::and_then
  18: deno_cli::ops::dispatch_json::json_op::{{closure}}
  19: deno_cli::state::ThreadSafeState::core_op::{{closure}}
  20: <alloc::boxed::Box<F> as core::ops::function::Fn<A>>::call
  21: deno::ops::OpRegistry::call
  22: deno::isolate::Isolate::pre_dispatch
  23: _ZN4deno4SendERKN2v820FunctionCallbackInfoINS0_5ValueEEE
  24: _ZN2v88internal25FunctionCallbackArguments4CallENS0_15CallHandlerInfoE
  25: _ZN2v88internal12_GLOBAL__N_119HandleApiCallHelperILb0EEENS0_11MaybeHandleINS0_6ObjectEEEPNS0_7IsolateENS0_6HandleINS0_10HeapObjectEEESA_NS8_INS0_20FunctionTemplateInfoEEENS8_IS4_EENS0_16BuiltinArgumentsE
  26: _ZN2v88internalL26Builtin_Impl_HandleApiCallENS0_16BuiltinArgumentsEPNS0_7IsolateE
  27: _ZN2v88internal21Builtin_HandleApiCallEiPmPNS0_7IsolateE
note: Some details are omitted, run with `RUST_BACKTRACE=full` for a verbose backtrace.
fatal runtime error: failed to initiate panic, error 5
Abort trap: 6
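
For reference, this panic comes from futures-executor's guard against running a blocking LocalPool on a thread that is already inside an executor. A minimal standalone sketch (plain futures 0.3, unrelated to Deno's code) trips the same guard:

  use futures::executor::block_on;

  fn main() {
    // The outer block_on enters futures-executor's LocalPool on this thread.
    block_on(async {
      // The inner block_on tries to enter again on the same thread and panics
      // with: "cannot execute `LocalPool` executor from within another
      // executor: EnterError".
      let _ = block_on(async { 1 + 1 });
    });
  }

In the backtrace above the outer executor is the tokio runtime worker driving op_worker_post_message rather than a second block_on, but the check that fires is the same one.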

@ry (Member) left a comment:

LGTM - thanks - I agree we just need to iterate on this a bit.

@ry merged commit f356b2b into denoland:master on Nov 17, 2019.
@bartlomieju deleted the refactor-fixes_for_futures branch on Nov 17, 2019 at 13:15.

bartlomieju added a commit to bartlomieju/deno that referenced this pull request on Dec 28, 2019:

After landing denoland#3358 the benchmarks exploded, indicating problems with workers and deno_core_http_bench.

This PR dramatically reduces the churn in thread/syscall counts that showed up on the benchmarks. The thread count is not back to previous levels, but the difference went from hundreds/thousands down to about 50.