Pure python request pipeline #429
Conversation
Force-pushed from bb29685 to 3a6df83
Force-pushed from ec58e87 to 1890159
Force-pushed from 1890159 to 0e83eeb
I think this looks fine conceptually.
Force-pushed from 7ad2adf to 6f474c6
Force-pushed from dbae11e to 6dbca91
Force-pushed from 0e83eeb to 5d189d2
Force-pushed from 75e5630 to f00644a
Force-pushed from aa96919 to 31a635c
Force-pushed from 31a635c to 6aa97b3
This is good to look at now. I'm working on the codegen changeover, which will bloat the PR size since so much is going to be deleted. I would prefer that to be a separate PR, though.
Minor comment, but otherwise I think this looks alright.
Force-pushed from 6aa97b3 to 588c3f2
Force-pushed from 588c3f2 to 59307ce
This replaces the code-generated request pipeline with the hand-written one.
Force-pushed from 59307ce to a07082b
Force-pushed from 9a777e0 to 7e34bde
I still need to pull this down and generate a client to see what the output delta is. My only main concern/question right now is how we're tracking behavioral/API changes for the larger refactor.
Review thread on codegen/core/src/main/java/software/amazon/smithy/python/codegen/HttpProtocolTestGenerator.java (resolved).
```java
if (shape.hasTrait(StreamingTrait.class)) {
    writeDeserializer("data_stream");
} else {
    writeDeserializer(shape);
}
```
What's unique about `data_stream` here that `shape` isn't accounting for? Just that it's an unknown length/shape?
`read_blob` expects bytes; `read_data_stream` expects a stream. As for `writeDeserializer`, when given a shape it uses the name of the shape type as the read method to dispatch to. When given a string, it just does `read_{name}`. We wouldn't want it to check streaming automatically, since that should only apply to blobs.
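To make the convention concrete, here is a minimal Python sketch of the dispatch behavior described above. All class and function names are illustrative, not the actual smithy-python or codegen API: a shape's type name selects a `read_{name}` method, while an explicit string bypasses the type lookup, and the caller decides when a streaming blob should use `data_stream`.

```python
# Hypothetical stand-ins for a shape and a deserializer; names are invented
# for illustration only.
class FakeShape:
    def __init__(self, shape_type: str, streaming: bool = False):
        self.shape_type = shape_type
        self.streaming = streaming

class Deserializer:
    def read_blob(self) -> str:
        return "read bytes"

    def read_data_stream(self) -> str:
        return "read stream"

def dispatch(deserializer: Deserializer, target) -> str:
    # A string means "call read_{name} verbatim"; a shape means
    # "derive the method name from the shape's type".
    name = target if isinstance(target, str) else target.shape_type
    return getattr(deserializer, f"read_{name}")()

d = Deserializer()
blob = FakeShape("blob")
streaming_blob = FakeShape("blob", streaming=True)

# Streaming is checked by the caller, since it only applies to blobs:
result = dispatch(d, "data_stream" if streaming_blob.streaming else streaming_blob)
```

The point of keeping the streaming check outside `dispatch` is exactly the one made above: applying it automatically would be wrong for non-blob shapes.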
This adds three top-level config options when sigv4 auth is enabled: * `aws_access_key_id` * `aws_secret_access_key` * `aws_session_token` These are used for static credentials. Names are identical to botocore.
This updates several dataclasses to avoid including schemas in their default `repr` so that they don't get logged without deliberate intent. Logging schemas can be very messy as they are huge.
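The standard-library mechanism for this is `dataclasses.field(repr=False)`; a minimal sketch (the class here is invented for illustration, not one of the actual dataclasses touched by the change):

```python
from dataclasses import dataclass, field

@dataclass
class DeserializedValue:
    value: int
    # Excluded from the generated repr so logging this object
    # doesn't dump a huge nested schema.
    schema: dict = field(default_factory=dict, repr=False)

v = DeserializedValue(value=1, schema={"huge": "nested structure"})
text = repr(v)  # the schema field does not appear in the repr
```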
This updates AsyncBytesReader to expose the underlying seek method of the data source, if it exists.
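A sketch of the delegation pattern this describes, under the assumption that the reader simply forwards `seek` when the wrapped source supports it. The class name and error choice here are illustrative, not the real `AsyncBytesReader` implementation.

```python
import io

class BytesReaderSketch:
    """Illustrative wrapper that exposes the source's seek, if any."""

    def __init__(self, source):
        self._source = source

    def seek(self, offset: int, whence: int = 0) -> int:
        # Delegate when the underlying source is seekable;
        # otherwise signal that seeking is unsupported.
        if hasattr(self._source, "seek"):
            return self._source.seek(offset, whence)
        raise io.UnsupportedOperation("underlying source is not seekable")

reader = BytesReaderSketch(io.BytesIO(b"hello"))
pos = reader.seek(2)  # delegates to BytesIO.seek
```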
This updates the error deserializer to check that the content type of a response matches the protocol's content type before it tries searching the payload for a shape id.
This updates query serialization to no longer add an extraneous `&` as well as ensures that the path starts with `/`.
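Both fixes can be sketched in a few lines, assuming the query is built from key/value pairs; the function names are illustrative. `"&".join` never emits a leading or trailing separator, which is what avoids the extraneous `&`.

```python
from urllib.parse import quote

def serialize_query(params: list[tuple[str, str]]) -> str:
    # Joining with "&" produces no stray separator for zero, one,
    # or many parameters.
    return "&".join(f"{quote(k)}={quote(v)}" for k, v in params)

def normalize_path(path: str) -> str:
    # Ensure the request path always begins with "/".
    return path if path.startswith("/") else "/" + path

query = serialize_query([("foo", "bar"), ("baz", "qux")])
path = normalize_path("objects/key")
```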
```python
    self._await_output(self._execute_request(call, request_future))
)
request_context = await request_future
input_stream = self.protocol.create_event_publisher(
```
It doesn't look like we have a concrete implementation for either the publisher or receiver. The old workflow looks like it was calling execute on the operation directly from the _duplex_stream method. Is the intent we need to implement that for every protocol going forward?
# Conflicts:
#   packages/smithy-http/src/smithy_http/serializers.py
This updates the async signer to use the special payload hash for event stream operations.
This replaces the generated request pipeline with one that is hand-written. Look at all those removed lines!
Note that CI won't pass because we need quite a few fixes. I'm splitting those off so that this isn't too big, and I've changed the target branch to a feature branch.
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.