
Skip 1xx frames in more states (fixes #524) #527


Merged 1 commit into hyperium:master on Mar 24, 2021

Conversation

@nox (Contributor) commented Mar 18, 2021

I couldn't find a way to repro the panic from a test.

@seanmonstar (Member)

Thanks, I think I understand now! I think a test where a request is sent without an EOS data frame would be able to trigger the panic on master. For example, something like this:

```rust
#[tokio::test]
async fn informational_while_local_streaming() {
    h2_support::trace_init!();
    let (io, mut srv) = mock::new();

    let srv = async move {
        let settings = srv.assert_client_handshake().await;
        assert_default_settings!(settings);
        srv.recv_frame(frames::headers(1).request("POST", "https://example.com/"))
            .await;
        srv.send_frame(frames::headers(1).response(103)).await;
        srv.send_frame(frames::headers(1).response(200).field("content-length", 2))
            .await;
        srv.recv_frame(frames::data(1, "hello").eos()).await;
        srv.send_frame(frames::data(1, "ok").eos()).await;
    };

    let h2 = async move {
        let (mut client, h2) = client::Builder::new()
            .handshake::<_, Bytes>(io)
            .await
            .unwrap();
        tokio::spawn(async {
            h2.await.expect("connection failed");
        });
        let request = Request::builder()
            .method(Method::POST)
            .uri("https://example.com/")
            .body(())
            .unwrap();
        // don't EOS stream yet..
        let (response, mut body_tx) = client.send_request(request, false).unwrap();
        // eventual response is 200, not 103
        let resp = response.await.expect("response");
        assert_eq!(resp.status(), 200);
        // now we can end the stream
        body_tx.send_data("hello".into(), true).expect("send_data");
        let mut body = resp.into_body();
        assert_eq!(body.data().await.unwrap().unwrap(), "ok");
    };

    join(srv, h2).await;
}
```

@nox (Contributor, Author) commented Mar 23, 2021

That test doesn't expose the panic either.

On master, it panics with:

```
thread 'informational_while_local_streaming' panicked at 'assertion failed: `(left == right)`
  left: `Reset { stream_id: StreamId(1), error_code: PROTOCOL_ERROR }`,
 right: `Data { stream_id: StreamId(1), flags: (0x1: END_STREAM) }`: assert_frame_eq', /Users/nox/src/h2/tests/h2-support/src/assert.rs:96:13
stack backtrace:
   7: client_request::informational_while_local_streaming::{{closure}}::{{closure}}
             at ./tests/client_request.rs:1273:9
```

Line 1273 being:

```rust
srv.recv_frame(frames::data(1, "hello").eos()).await;
```

If I comment out everything about the request body, like this:

```rust
#[tokio::test]
async fn informational_while_local_streaming() {
    h2_support::trace_init!();
    let (io, mut srv) = mock::new();

    let srv = async move {
        let settings = srv.assert_client_handshake().await;
        assert_default_settings!(settings);
        srv.recv_frame(frames::headers(1).request("POST", "https://example.com/"))
            .await;
        srv.send_frame(frames::headers(1).response(103)).await;
        srv.send_frame(frames::headers(1).response(200).field("content-length", 2))
            .await;
        // srv.recv_frame(frames::data(1, "hello").eos()).await;
        // srv.send_frame(frames::data(1, "ok").eos()).await;
    };

    let h2 = async move {
        let (mut client, h2) = client::Builder::new()
            .handshake::<_, Bytes>(io)
            .await
            .unwrap();
        tokio::spawn(async {
            h2.await.expect("connection failed");
        });
        let request = Request::builder()
            .method(Method::POST)
            .uri("https://example.com/")
            .body(())
            .unwrap();
        // don't EOS stream yet..
        let (response, mut body_tx) = client.send_request(request, false).unwrap();
        // eventual response is 200, not 103
        let resp = response.await.expect("response");
        assert_eq!(resp.status(), 200);
        // now we can end the stream
        // body_tx.send_data("hello".into(), true).expect("send_data");
        // let mut body = resp.into_body();
        // assert_eq!(body.data().await.unwrap().unwrap(), "ok");
    };

    join(srv, h2).await;
}
```

I get this panic:

```
thread 'informational_while_local_streaming' panicked at 'response: Error { kind: Proto(PROTOCOL_ERROR) }', tests/h2-tests/tests/client_request.rs:1293:35
```

Line 1293 being:

```rust
let resp = response.await.expect("response");
```

@nox (Contributor, Author) commented Mar 23, 2021

By the way, shouldn't `send_open` have some logic about informational header frames too?

@seanmonstar (Member)

> shouldn't `send_open` have some logic about informational header frames too?

I don't believe so. There isn't currently any way in the server API to send a 1xx response.
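For context (not part of the patch, just a standalone illustration): informational responses are interim responses in the 1xx range, which a client must be prepared to receive and skip before the final response arrives. The range check at the heart of any "skip 1xx" logic can be sketched like this, assuming nothing from h2's internals:

```rust
// Returns true for HTTP 1xx informational status codes, which are interim
// responses a client should skip while waiting for the final response.
// (Status codes 100-199; e.g. 100 Continue, 103 Early Hints.)
fn is_informational(status: u16) -> bool {
    (100..=199).contains(&status)
}

fn main() {
    // 103 Early Hints, as sent by the mock server in the test above,
    // is informational; the final 200 response is not.
    assert!(is_informational(103));
    assert!(!is_informational(200));
}
```

In h2's state machine the same check has to apply in every state where response headers can arrive, which is what this PR extends.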

The test failure suggests to me that we did find a case that hits a problem (I'm pretty sure it should work), just not the one triggering the original panic. More digging to do!

@nox (Contributor, Author) commented Mar 24, 2021

@seanmonstar So this particular test would be enough to land this PR, then?

@nox (Contributor, Author) commented Mar 24, 2021

I added the test, which passes now when it didn't before; hopefully the root cause is the same as the panic we observed in the wild.

@seanmonstar (Member)
Yea, let's try it out! At the least, this patch does fix the added test.

@seanmonstar seanmonstar merged commit a1f914f into hyperium:master Mar 24, 2021
@nox nox deleted the fix-524 branch March 24, 2021 16:24
BenxiangGe pushed a commit to BenxiangGe/h2 that referenced this pull request Jul 26, 2021