
make hamilton heater shaker async #425

Open
wants to merge 2 commits into main

Conversation

ben-ray (Contributor) commented Mar 12, 2025

Tested with an 8x HHS box.

Comment on lines +55 to +63
async def _send_command(self, command: str, **kwargs):
  assert len(command) == 2, "Command must be 2 characters long"

  # Build the "<key><value>" argument string expected by the firmware.
  args = "".join([f"{key}{value}" for key, value in kwargs.items()])
  # Run the blocking write in a worker thread so the event loop is not blocked.
  await asyncio.to_thread(self.io.write, f"T{self.shaker_index}{command}id{str(self.command_id).zfill(4)}{args}".encode())

  self.command_id = (self.command_id + 1) % 10_000
  # Same for the blocking read of the response.
  response = await asyncio.to_thread(self.io.read)
  return response
Member
Actually, I don't think the read or write themselves take that long, and introducing multithreading leads to potential complications.

I think it's better to do many small reads with asyncio.sleep in between.
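A minimal sketch of that polling approach, assuming `self.io.read` is non-blocking and returns `b""` when no data has arrived yet; the `timeout` and `poll_interval` parameters are illustrative:

```python
import asyncio

async def _read_response(self, timeout: float = 30.0, poll_interval: float = 0.05) -> bytes:
  # Repeatedly try a short, non-blocking read, yielding to the event loop in between.
  deadline = asyncio.get_running_loop().time() + timeout
  while True:
    data = self.io.read()  # assumed to return b"" when nothing is buffered yet
    if data:
      return data
    if asyncio.get_running_loop().time() > deadline:
      raise TimeoutError("no response from heater shaker")
    await asyncio.sleep(poll_interval)
```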

Contributor Author

Good idea!

Contributor

> I think it's better to do many small reads with asyncio.sleep in between.

Shouldn't the read itself be awaitable, so that `await self.io.read()` is sufficient? Otherwise every backend using the io layer has to implement this logic again.
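A rough sketch of what that could look like at the io layer, so every backend just does `response = await self.io.read()`; the `SerialIO` name, the `poll_interval` attribute, and the pyserial-style `self.ser.read(...)` call are assumptions for illustration, not the existing PyLabRobot API:

```python
import asyncio

class SerialIO:
  """Hypothetical io-layer wrapper that makes read awaitable for every backend."""

  def __init__(self, ser, poll_interval: float = 0.05):
    self.ser = ser  # e.g. a pyserial Serial opened with timeout=0 (non-blocking)
    self.poll_interval = poll_interval

  def write(self, data: bytes) -> None:
    self.ser.write(data)

  async def read(self, size: int = 128) -> bytes:
    # Poll the device until data arrives, yielding to the event loop between attempts.
    while True:
      data = self.ser.read(size)  # returns b"" immediately when nothing is buffered
      if data:
        return data
      await asyncio.sleep(self.poll_interval)
```

With something like that, `_send_command` above reduces to `response = await self.io.read()` and no backend has to reimplement the polling loop.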

Member

Better idea, will go with that.

ben-ray (Contributor Author) commented Mar 12, 2025

Just found out this also breaks `.decode` on `get_temperature`.
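A guess at the failure mode, purely for illustration: once `_send_command` is async, any caller that was not updated to await it gets a coroutine back instead of bytes, so `.decode()` raises.

```python
# If get_temperature still calls _send_command without awaiting it,
# it receives a coroutine instead of bytes:
response = self._send_command(...)  # <coroutine object _send_command at 0x...>
response.decode()                   # AttributeError: 'coroutine' object has no attribute 'decode'

# The fix is for get_temperature (and similar callers) to be async and await the call:
response = await self._send_command(...)
value = response.decode()
```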
