Feature/ff local server #70

Merged: 2 commits merged into feature/local-server on May 5, 2020

Conversation

fabianfett (Member) commented:

Implemented some of the comments.

@fabianfett force-pushed the feature/ff-local-server branch from d67c801 to f0f5e4b on May 5, 2020 at 19:41
private var processing = CircularBuffer<(head: HTTPRequestHead, body: ByteBuffer?)>()

private static var queue = [Pending]()
Contributor: use CircularBuffer, and then you can pop as optional
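
A minimal sketch of that suggestion, assuming NIO's CircularBuffer (whose popFirst() returns an optional) and a simplified Pending type; the field types here are placeholders, not the PR's exact definitions:

import NIO

// Simplified stand-in for the PR's Pending type.
struct Pending {
    let requestId: String
    let request: ByteBuffer
    let responsePromise: EventLoopPromise<ByteBuffer?>
}

// Instead of `[Pending]()`, a CircularBuffer gives cheap removal from the front
// and an optional-returning pop, so callers no longer need an isEmpty check.
var queue = CircularBuffer<Pending>()

func dequeue() -> Pending? {
    queue.popFirst() // nil when the queue is empty
}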

}
guard requestId == pending.requestId else {
Contributor: I used a dictionary to make sure we hand back the right pending request; this change makes it order-based, which matters if there is some concurrency between the client submitting work (e.g. an iOS app) and the lambda polling and processing that work (the lambda code).
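
For contrast, a rough sketch of the dictionary-based lookup described above, with the stored value simplified to just the response promise (an assumption made for brevity):

import NIO

// Keyed by requestId: when the lambda posts its response for a given requestId,
// the matching promise is found regardless of the order in which invocations
// were handed out, which is the ordering concern raised in the comment.
var pendingByRequestId = [String: EventLoopPromise<ByteBuffer?>]()

func complete(requestId: String, with body: ByteBuffer?) {
    guard let promise = pendingByRequestId.removeValue(forKey: requestId) else {
        return // unknown or already completed request
    }
    promise.succeed(body)
}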

@@ -88,13 +88,20 @@ private enum LocalLambda {
}

final class HTTPHandler: ChannelInboundHandler {

enum InvocationState {
case waitingForNextRequest
@tomerd (Contributor), May 5, 2020: suggested rename to waitingForLambdaRequest


enum InvocationState {
case waitingForNextRequest
case idle(EventLoopPromise<Pending>)
Contributor: suggested rename to waitingForInvocation

enum InvocationState {
case waitingForNextRequest
case idle(EventLoopPromise<Pending>)
case processing(Pending)
Contributor: suggested rename to waitingForLambdaResponse
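
Putting the three naming suggestions above together, the state enum could read roughly as follows; the Invocation associated type anticipates the Pending-to-Invocation rename suggested below, and its field list is an assumption:

import NIO

// Placeholder for the queued work item (see the Pending -> Invocation comment below).
struct Invocation {
    let requestId: String
    let request: ByteBuffer
}

enum InvocationState {
    case waitingForLambdaRequest                             // was: waitingForNextRequest
    case waitingForInvocation(EventLoopPromise<Invocation>)  // was: idle(EventLoopPromise<Pending>)
    case waitingForLambdaResponse(Invocation)                // was: processing(Pending)
}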

@@ -137,43 +144,63 @@ private enum LocalLambda {
self.writeResponse(context: context, response: .init(status: .internalServerError))
}
}
Self.queueLock.withLock {
Self.queue[requestId] = Pending(requestId: requestId, request: work, responsePromise: promise)
let pending = Pending(requestId: requestId, request: work, responsePromise: promise)
Contributor: suggested rename of Pending to Invocation

}

// pop the first task from the queue
switch !Self.queue.isEmpty ? Self.queue.removeFirst() : nil {
Contributor: circular buffer will make this easier
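
A sketch of what that switch could collapse to once the queue is backed by a CircularBuffer; the element type is simplified to a request id string for brevity, so this is an illustration rather than the merged code:

import NIO

var queue = CircularBuffer<String>()

func processNext() {
    // popFirst() already returns an optional, so the
    // `!Self.queue.isEmpty ? Self.queue.removeFirst() : nil` dance collapses into one call.
    switch queue.popFirst() {
    case .none:
        break // nothing queued yet; wait for the next client request
    case .some(let requestId):
        print("processing \(requestId)") // hand the work to the polling lambda here
    }
}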

@tomerd merged commit 5089380 into feature/local-server on May 5, 2020
@tomerd (Contributor) commented on May 5, 2020:

@fabianfett merged this and will continue to iterate on the original branch

fabianfett added a commit that referenced this pull request May 7, 2020
* Don’t exit immediately
* Removed locks. Just running in one EL
fabianfett added a commit that referenced this pull request May 12, 2020
* Don’t exit immediately
* Removed locks. Just running in one EL
fabianfett added a commit that referenced this pull request May 12, 2020
* Don’t exit immediately
* Removed locks. Just running in one EL
fabianfett added a commit that referenced this pull request May 13, 2020
* Don’t exit immediately
* Removed locks. Just running in one EL
@fabianfett deleted the feature/ff-local-server branch on May 19, 2020 at 15:18