added reduce function to btrdb Stream and BTrDB to make objects pickl…#69

Closed
aorticweb wants to merge 16 commits into develop from pickle-stream

Conversation

@aorticweb
Contributor

  • Added a `__reduce__` method to the btrdb `Stream` and `BTrDB` classes to make the objects pickle-able (this required modifying their `__init__` methods).
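As a rough sketch of the pattern described above (the `Conn` class and its parameters are illustrative stand-ins, not the actual btrdb code), `__reduce__` tells pickle to rebuild the object by re-calling its constructor rather than serializing a live connection handle:

```python
import pickle

class Conn:
    """Illustrative stand-in for a connection-holding object such as btrdb.BTrDB."""
    def __init__(self, conn_params):
        # Keep the parameters needed to rebuild the connection later.
        self.conn_params = conn_params

    def __reduce__(self):
        # Instruct pickle to call Conn(conn_params) on unpickling instead of
        # attempting to serialize an unpicklable live connection.
        return (self.__class__, (self.conn_params,))

original = Conn({"conn_str": "db.example.com:4410"})
restored = pickle.loads(pickle.dumps(original))
assert restored.conn_params == original.conn_params
```

On unpickling, pickle calls `Conn({...})` again, so anything derived from those parameters (like the underlying gRPC channel) is recreated fresh.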

@aorticweb aorticweb requested a review from looselycoupled June 17, 2020 18:58
@looselycoupled
Member

Hi @aorticweb, can we get some tests included for this PR before I review? We'll also need information added to the documentation so people understand how to use this. Feel free to ping me on slack!

@looselycoupled
Member

Same issue with this PR. We only merge feature branches into the develop branch. Then we do a release commit to merge into master where we update the software version, etc. Can you change this PR to point to develop?

@aorticweb aorticweb changed the base branch from master to develop June 19, 2020 14:51
@pytest.fixture(scope="session")
def pickle_folder(tmpdir_factory):
    fn = tmpdir_factory.mktemp("data")
    return fn
Member

I'm not convinced this is actually used anywhere...

Contributor Author

Yep, I removed it.

        self.conn_params = conn_params

    def __reduce__(self):
        return (self.__class__, (self.conn_params,))
Member

I read this to mean that we are essentially serializing the user's apikey into the binary file, right? If that's the case then we may want to rethink how to handle that. Otherwise, anyone with access to the pickled object now has the user's apikey. Ideally we'd want to re-inflate the stream with the "current" user's credentials.
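A sketch of that suggestion (the `SafeConn` class and the `BTRDB_API_KEY` variable name are assumptions for illustration, not code from this PR): omit the apikey from what `__reduce__` returns, so the unpickling side re-reads the current user's credentials from the environment instead of inheriting the original user's key:

```python
import os
import pickle

class SafeConn:
    """Illustrative sketch: the apikey is never serialized."""
    def __init__(self, conn_str, apikey=None):
        self.conn_str = conn_str
        # Fall back to the current user's environment when no key is given.
        # BTRDB_API_KEY is an assumed variable name for this sketch.
        self.apikey = apikey or os.environ.get("BTRDB_API_KEY")

    def __reduce__(self):
        # Only the non-secret connection string is pickled; the apikey is
        # deliberately omitted so the unpickling user supplies their own.
        return (self.__class__, (self.conn_str,))

os.environ["BTRDB_API_KEY"] = "current-users-key"
restored = pickle.loads(pickle.dumps(SafeConn("db.example.com:4410", "original-key")))
assert restored.apikey == "current-users-key"  # the original key was not serialized
```

The pickled bytes then contain only the connection string, so sharing the file does not leak the producing user's credentials.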

Contributor Author

I see how this is a security issue; I will edit the file to rebuild from the initial connection information.

creds = credentials(conn_str, apikey)
if "endpoints" in creds:
    return _connect(**creds)
return _connect(**creds, conn_params={"conn_str": conn_str})
Member

If I'm reading this correctly... conn_params does not include the apikey because we wouldn't want to have the apikey saved... but there might be a failure mode if they do supply the apikey, then the object gets unpickled and the ENV vars do not exist?

Seems like a thorny problem. I think we probably need to do an in-person code review so I understand all the implications.
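One possible guard for that failure mode (a hypothetical helper, not part of this PR): fail fast with a clear error at unpickle time when neither an explicit apikey nor the environment variable is available, rather than silently reconnecting without credentials. The `BTRDB_API_KEY` name is again an assumption for illustration:

```python
import os

def resolve_apikey(explicit=None, env_var="BTRDB_API_KEY"):
    """Return an apikey, raising a descriptive error instead of quietly
    rebuilding an unauthenticated connection when the env var is missing."""
    key = explicit or os.environ.get(env_var)
    if key is None:
        raise RuntimeError(
            f"No apikey supplied and {env_var} is unset; "
            "cannot rebuild the unpickled connection."
        )
    return key
```

Called from whatever `__reduce__` reconstructs, this turns a confusing downstream connection failure into an immediate, explainable error.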

@looselycoupled
Member

@aorticweb This is a pretty complex problem and I think we need to set up a meeting to review this code.

@looselycoupled
Member

Closing this PR as we've decided not to allow pickle for the time being.

3 participants