What happens when I return a promise in Node.js functions? #431
Comments
Check with Matt Mason. - /cc @mamaso |
Is there a difference between calling context.done() versus returning a promise? Examples of each approach: |
I think these are the equivalences - it's all sugar: context.done() -> promise that resolves without a value |
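A sketch of those equivalences, using hypothetical handler bodies (not code from this thread): each of the three styles below signals completion to the Functions runtime in the same way.

```javascript
// 1. Callback style: context.done() resolves without a value
function callbackStyle(context) {
    context.res = { status: 200, body: "done" };
    context.done();
}

// 2. Explicitly returning a promise that resolves without a value
function promiseStyle(context) {
    context.res = { status: 200, body: "done" };
    return Promise.resolve();
}

// 3. async function: returns a promise implicitly, so no context.done() needed
async function asyncStyle(context) {
    context.res = { status: 200, body: "done" };
}
```

In other words, an async function is just sugar over the promise form, which in turn stands in for calling context.done() yourself.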
@christopheranderson yep, those are the semantics. Code which applies the I can see some potential confusion around
|
That's a good catch in the $return semantics. |
To be fair |
Agreed on the docs issue. The doc talks about the value, but then there is no return called. |
Should have made it like |
I am really having a heck of a time trying to wrap my head around this. I've tried the examples outlined, using either promise syntax or context syntax, but neither actually worked for me. After much too long of trial and error, I've put together a very rudimentary example of taking a query string from a request (e.g. a URL), passing it along to a function to fetch the website data, and returning the content:
I'm actually surprised the above even works - just about every attempt prior to this has resulted in failure. I don't have much of a point here other than to +1 for some additional async JavaScript documentation within Azure Functions. Personally, I'm finding stumbling blocks particularly around the return semantics. |
@iyerusad Yeah, we definitely need documentation on how you can assign an output binding to take the value of the function's return. This should be an easier way to do what you were trying to do:
Note that the named value "$return" makes it so that the returned value in your function is assigned to that output binding (since you can have multiple output bindings per trigger). index.js
Note that I did not do a "try/catch", as you did, since unhandled errors are caught by the runtime and logged and will return a 500. If you did want to do custom error handling and logic around it, I would recommend something like below. Note that (mainly for clarity) I did not define a function within the exported function.
|
My function.json had contained a differently named output binding. Thinking aloud: changing the http->out name binding to use $return made it work. Can you further expound on:
Much appreciate the insight @mhoeger - I must have read several hundred posts on async and promises in the last few days, and this is actually the first time things started falling into place and functioning as expected. |
@iyerusad - You got it! A few notes on the two points you mentioned:
This is also valid:
index.js
This snippet of code from the Azure Functions internals that run your code may be helpful in understanding what our expectations around promises are. If a Promise is returned (an Object with a then function), the worker waits for it to resolve before completing the invocation.
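A rough sketch (not the actual worker source) of that detection logic: any "thenable" result, i.e. an object with a .then function, is treated as a promise to wait on before the invocation is marked complete.

```javascript
// Detect a promise-like ("thenable") return value
function isThenable(result) {
    return !!result && typeof result.then === "function";
}

// Simplified model of how a runtime could handle a handler's return value
function invoke(handler, context) {
    const result = handler(context);
    if (isThenable(result)) {
        // Returned a promise: wait for it, then finish the invocation
        return result.then(value => ({ completed: true, value }));
    }
    // Synchronous handler: it is expected to call context.done() itself
    return Promise.resolve({ completed: false, value: result });
}
```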
A quick example:
index.js:
|
Agreed, so much so that this probably should make its way into the default template. Thank you very much for the link from the PouchDB developer around Promises - some of the advanced mistakes outlined are still fuzzy, but it helps with wrapping my head around what is apparently a better but still tricky approach for async code. The multiple output bindings example is also very helpful: it probably would be very common to have an Azure Function return a 200 over HTTP (aka "received your input!"), as well as toss something into a message bus for further processing. Coming from a systems automation background, this quote (from the mentioned article) really resonates with me:
Suggested take-away (for when writing docs): as a newbie, knowing up front what to change in host.json would have saved a lot of time. Much appreciate the guidance @mhoeger |
Ahh - yeah that's a very good point. Thanks for the feedback @iyerusad!! I'm torn between changing the template to |
Tracking template updates in Azure/azure-functions-templates#771 Updated docs here: https://docs.microsoft.com/en-us/azure/azure-functions/functions-reference-node#bindings |
Leaving note for anyone else/future me (nothing actionable, just speaking aloud): A dozen or so azure functions later, I see one pitfall of using
Last return is never hit because
The above helped me better visualize why I would want to use that pattern. Also, a random unexpected (but understandable) thing:
returns "blah". To return actual obj (with all its content), had to do this
|
If anyone has the time and the patience, would you look through the below under the lens of a code/peer review? If not, no worries; at a basic level it works. -- Below I drafted a basic wrapper to upload a blob to Azure Blob Storage (with the assumption such a basic function would be one-offed like this). From the standpoint of
const appInsights = require("applicationinsights");
if (process.env.APPINSIGHTS_INSTRUMENTATIONKEY) {appInsights.setup(process.env.APPINSIGHTS_INSTRUMENTATIONKEY).start()}
const { BlobServiceClient, StorageSharedKeyCredential } = require('@azure/storage-blob');
// Spec: Store blobs in Azure Blob storage
// Inputs - POST
//// - [REQUIRED] req.body - JSON containing meta and file
//// {
//// account: "",
//// accountKey: "",
//// container: "",
//// file: {
//// type: "", //optional
//// name: "",
//// b64: ""
//// }
//// }
//// 1. Check for appropriate fields supplied
//// 2. Attempt uploading blob
// Outputs
//// Success: 200/201 - String - `${blobAccountURL}/${container}/${blobName}`
//// Failure: 422 - JSON - { message: "Failed to upload blob: errors returned", errors: `${Comma delimited string of errors}` }
module.exports = async function (context, req) {
context.log.info('Started Function: azure-blob-store');
const errors = []; //Casting as Array
let account;
let accountkey;
let blobAccountURL;
let container;
let blobName;
let blobContent;
let storedFile;
if (req.body && req.body.account) {
account = req.body.account;
blobAccountURL = `https://${account}.blob.core.windows.net`
} else {let e = "Missing account name"; context.log.error(e); errors.push(e)}
if (req.body && req.body.accountkey) {
accountkey = req.body.accountkey;
} else {let e = "Missing account key"; context.log.error(e); errors.push(e)}
if (req.body && req.body.container) {
container = req.body.container;
} else {let e = "Missing container name"; context.log.error(e); errors.push(e)}
if (req.body && req.body.file && req.body.file.name) {
blobName = req.body.file.name;
} else {let e = "Missing file name"; context.log.error(e); errors.push(e)}
if (req.body && req.body.file && req.body.file.b64) {
blobContent = Buffer.from(req.body.file.b64, 'base64');
} else {let e = "Missing file content (in b64)"; context.log.error(e); errors.push(e)}
if (account && container && accountkey && blobName && blobContent) {
const sharedKeyCredential = new StorageSharedKeyCredential(account, accountkey); //Craft sharedKeyCredential
const blobServiceClient = new BlobServiceClient(blobAccountURL, sharedKeyCredential); //Craft blobServiceClient
const containerClient = blobServiceClient.getContainerClient(container); //Craft containerClient
const blockBlobClient = containerClient.getBlockBlobClient(blobName); //Craft blockBlobClient
//Upload blob content
context.log.info("Attempting to upload blob")
await blockBlobClient.upload(blobContent, Buffer.byteLength(blobContent))
.then(r => {
context.log.info("Success in uploading blob")
storedFile = r;
})
.catch(err => {
let e = `Failed attempting to upload blob: ${blobAccountURL}/${container}/${blobName}`;
errors.push(e)
errors.push(`${err}`)
context.log.error(e)
context.log.error(`${err}`)
})
}
if (storedFile && errors.length == 0) {
context.log.info(`Exit: 200; Successfully uploaded blob: ${blobAccountURL}/${container}/${blobName}`)
context.res = {
status: 200,
body: `${blobAccountURL}/${container}/${blobName}`
}
}
if (errors.length > 0) {
context.log.warn(`Exit: 422; Failed to upload blob due to errors`)
context.res = {
status: 422,
body: {
message: "Failed to upload blob: errors returned",
errors: `${errors.join(", ")}`
}
}
}
//return context.res //<-- is implied; Whatever is set in context.res returns
}; |
@iyerusad - tbh I love these types of questions :) As a disclaimer, I would take everything I say with a grain of salt and that they're my opinions on what makes sense. I've heard (and agree) that code clarity is what's most important (over fancy things), so please take things that add clarity and leave things that don't! One approach to the "redundant" error handling is to make a helper method like this, where you pass in the errors array and modify that as a side-effect of your code.
// In same file, just below module.exports or somewhere else - just a helper function
function handleError(errors, log, errorMessage, exception) {
log.error(errorMessage);
errors.push(errorMessage);
if (exception) {
log.error(JSON.stringify(exception));
errors.push(exception);
}
return errors;
}
// called like
// else {
// handleError(errors, context.log, "Missing this thing");
// }
// or
// catch (exception) {
// handleError(errors, context.log, "Failed to do this", exception);
// } I'm a little torn on the above because I don't like having to pass in "errors" and "log". Especially weird is that we're modifying "errors" in the method and then expecting that consequence to happen to the original input too (whereas if our handleError did errors = [], the input wouldn't be changed). We could also explicitly return errors, but that adds redundant code too. One kind-of out there approach is to give each function invocation an "environment" by making it into an object. Take it or leave it, I just wanted to mention it! Sorry about the bad class names, still iterating on this idea :) module.exports = async (context, req) => {
let blobMaker = new BlobMaker(context, req);
return blobMaker.run();
}
class BlobMaker {
context;
log;
errors = [];
account;
constructor(context, req) {
this.log = context.log;
this.account = this.validateInput("account", req.body && req.body.account);
this.context = context;
// warning: don't do something like this.res = context.res, because assigning to
// res will not work as expected! Log is ok because we're calling a method on it,
// not assigning to it.
}
async run() {
this.log.info('Started Function: azure-blob-store');
if (this.account) {
this.log("OMG!!");
return {
status: 200,
body: "ok!!"
}
} else {
return {
status: 500,
body: "womp womp :( " + JSON.stringify(this.errors)
}
}
}
validateInput(parameterName, parameter) {
if (parameter) {
return parameter
} else {
let e = `Missing ${parameterName}`;
this.handleError(e);
}
}
handleError(errorMessage) {
this.log.error(errorMessage);
this.errors.push(errorMessage);
}
// whatever other methods that help break this up! If you use TypeScript, you could
// also mark these as "private"
} The advantage here is that you can more easily break apart code into manageable bits while accessing commonly needed things like "log", "context", and "req". So here, we set it up so that each invocation creates a new instance of an object and runs the needed method on it. A suggestion for your "//Upload blob content" code:
// Instead of a .then and .catch, use try/catch with await! I think this more clearly represents
// what you're trying to do
try {
storedFile = await blockBlobClient.upload(blobContent, Buffer.byteLength(blobContent));
} catch (err) {
let e = `Failed attempting to upload blob: ${blobAccountURL}/${container}/${blobName}`;
errors.push(e)
errors.push(`${err}`)
context.log.error(e)
context.log.error(`${err}`)
} For http outputs, my personal preference is to use the "$return" binding and "return" the http response to make sure that code afterwards doesn't change it. Although I would only recommend this with V3, where this bug is fixed: Azure/azure-functions-nodejs-worker#228. Another note: you might want to consider caching the clients you're creating? Here are some guidelines: https://docs.microsoft.com/en-us/azure/azure-functions/manage-connections I know your case is a bit trickier because the blob account changes dynamically - so this is only a suggestion if your inputs tend to be a subset of a few known entities and if you expect this function to receive a lot of concurrent traffic. I might have missed some of your questions, please let me know! |
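The client-caching suggestion could be sketched like this; createClient is a hypothetical factory standing in for, e.g., constructing a BlobServiceClient from @azure/storage-blob:

```javascript
// Module-level state survives across invocations on a warm Functions
// instance, so repeated requests for the same account reuse one client
// instead of constructing a new connection each time.
const clientCache = new Map();

function getClient(account, createClient) {
    if (!clientCache.has(account)) {
        clientCache.set(account, createClient(account));
    }
    return clientCache.get(account);
}
```

As noted above, this only pays off when the accounts you see tend to be a small, recurring set.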
Got a chance to dig into this today. Very much appreciate the pointers and insight. No obligation to reply; below are my findings/musings. Thank you.
function errorHandler(error, prefix, exception) {
if (error && error.response && error.response.data) {
if (error.response.data.error) {error = error.response.data.error}
else if (error.response.data.errors) {error = error.response.data.errors}
else {error = JSON.stringify(error.response.data)}
}
if (prefix) {error = prefix + " " + error}
if (exception) {exceptions.push(`${error}`)}
else {errors.push(`${error}`)}
context.log.error(`${error}`)
} I added a prefix component (an identifier for the failing call, prepended to the error message). Here is a real-world example of how that looks from a failure within a Function -> SubFunction (workflow-processSubmission) -> SubFunction (Document-store) -> SubFunction (helper-getcontact)
or another (where blob accountkey is incorrect):
//Upload blob content
try {
context.log.info("Attempting to upload blob")
let options = { blobHTTPHeaders: { blobContentType: mimeType } };
await blockBlobClient.upload(blobContent, Buffer.byteLength(blobContent), options)
context.log.info("Success in uploading blob")
storedFile = blobURL;
} catch(err) {
errorHandler(err, "blockBlobClient.upload() ->")
} My C# class professor did tell us to use try/catch sparingly since it was an expensive operation; the latest reading seems to indicate it's only expensive if an actual exception is thrown (which I understand as: flag all expected issues before the try/catch, leaving the catch for something truly exceptional).
Tweaked iteration:
const appInsights = require("applicationinsights");
if (process.env.APPINSIGHTS_INSTRUMENTATIONKEY) {appInsights.setup(process.env.APPINSIGHTS_INSTRUMENTATIONKEY).start()}
const {
BlobServiceClient,
StorageSharedKeyCredential,
generateBlobSASQueryParameters,
BlobSASPermissions
} = require('@azure/storage-blob');
// Spec: Store blobs in Azure Blob storage
//// Security: Anonymous - Wrapper around Azure Storage Library; auth is from request. No auth is provided by wrapper.
// Inputs - POST
//// - [REQUIRED] req.body - JSON containing meta and file
//// {
//// account: "",
//// accountKey: "",
//// container: "",
//// file: {
//// name: "",
//// b64: "",
//// mime: "",
//// SASify: [datetime] //optional, can be "true" or a custom expiration date
//// }
//// }
//// 1. Check for appropriate fields supplied
//// 2. Attempt uploading blob
// Outputs
//// Success: 200 - String - `${blobAccountURL}/${container}/${blobName}`
//// Failure: 422 - JSON - Errors encountered
//// Failure: 500 - JSON - Fatal exceptions encountered
module.exports = async function (context, req) {
context.log.info('Started Function: azure-blob-store');
function isDate(string) {
if (typeof string === "boolean") {string = "invalid"}
return (new Date(string).toString() !== "Invalid Date") && !isNaN(new Date(string));
}
function errorHandler(error, prefix, exception) {
if (error && error.response && error.response.data) {
if (error.response.data.error) {error = error.response.data.error}
else if (error.response.data.errors) {error = error.response.data.errors}
else {error = JSON.stringify(error.response.data)}
}
if (prefix) {error = prefix + " " + error}
if (exception) {exceptions.push(`${error}`)}
else {errors.push(`${error}`)}
context.log.error(`${error}`)
}
const errors = []; //Casting as Array
const exceptions = []; //Casting as Array
let account;
let accountkey;
let blobAccountURL;
let container;
let blobName;
let blobContent;
let mimeType;
let SASstartTime;
let SASexpireTime;
let SAStoken;
let blobURL;
let storedFile;
if (req.body && req.body.account) {
account = req.body.account;
blobAccountURL = `https://${account}.blob.core.windows.net`
} else {errorHandler("Missing account")}
if (req.body && req.body.accountkey) {
accountkey = req.body.accountkey;
} else {errorHandler("Missing accountKey")}
if (req.body && req.body.container) {
container = req.body.container;
} else {errorHandler("Missing container")}
if (req.body && req.body.file && req.body.file.name) {
blobName = req.body.file.name;
} else {errorHandler("Missing file.name")}
if (req.body && req.body.file && req.body.file.b64) {
blobContent = Buffer.from(req.body.file.b64, 'base64');
} else {errorHandler("Missing file content in file.b64 (as base64)")}
if (req.body && req.body.file && req.body.file.mime) {
mimeType = req.body.file.mime;
} else {errorHandler("Missing file.mime")}
if (req.body && req.body.file && req.body.file.SASify) {
//Start time
SASstartTime = new Date(new Date().setMinutes(new Date().getMinutes() - 5)); // Set start time to 5 minutes ago to avoid clock skew.
//Expire Time
if (isDate(req.body.file.SASify)) {
SASexpireTime = new Date(req.body.file.SASify) //Use datetime from request for expiration time
} else {
SASexpireTime = new Date(new Date().setMinutes((new Date().getMinutes() + 30))); //30 minute default, or if an invalid date was provided
}
} //optional parameter
if (blobAccountURL && container && blobName) {
blobURL = encodeURI(`${blobAccountURL}/${container}/${blobName}`)
}
if (account && container && accountkey && blobName && blobContent && mimeType) {
const sharedKeyCredential = new StorageSharedKeyCredential(account, accountkey); //Craft sharedKeyCredential
const blobServiceClient = new BlobServiceClient(blobAccountURL, sharedKeyCredential); //Craft blobServiceClient
const containerClient = blobServiceClient.getContainerClient(container); //Craft containerClient
const blockBlobClient = containerClient.getBlockBlobClient(blobName); //Craft blockBlobClient
//TODO: Should check if blob exists before overwriting. Maybe there is overload of blockBlobClient for this?
//Upload blob content
try {
context.log.info("Attempting to upload blob")
let options = { blobHTTPHeaders: { blobContentType: mimeType } };
await blockBlobClient.upload(blobContent, Buffer.byteLength(blobContent), options)
context.log.info("Success in uploading blob")
storedFile = blobURL;
} catch(err) {errorHandler(err, "blockBlobClient.upload() ->")}
if (storedFile && SASstartTime && SASexpireTime) {
try {
SAStoken = generateBlobSASQueryParameters({
containerName: container,
permissions: BlobSASPermissions.parse("r"),
blobName: blobName,
expiresOn: SASexpireTime,
startsOn: SASstartTime
}, sharedKeyCredential
).toString();
} catch(err) {errorHandler(err, "generateBlobSASQueryParameters ->")}
if (SAStoken) {
storedFile += `?${SAStoken}`
}
}
}
if (storedFile && errors.length == 0) {
context.log.info(`Exit: 200; Successfully uploaded blob: ${blobURL}`)
context.res = {
status: 200,
body: storedFile
}
}
if (exceptions.length > 0) {
context.log.warn(`Exit: 500; Encountered fatal error`)
context.res = {
status: 500,
headers: {"Content-Type": "application/json"},
body: {
message: "FATAL failure: server side failure",
errors: `${exceptions.join(", ")}`
}
}
} else if (errors.length > 0) {
context.log.warn(`Exit: 422; Encountered errors during execution`)
context.res = {
status: 422,
headers: {"Content-Type": "application/json"},
body: {
message: "Exit: 422; Encountered errors during execution",
errors: `${errors.join(", ")}`
}
}
}
//return context.res //<-- is implied; Whatever is set in context.res returns
}; |
Needs more docs