fix: tests, improvements #302
Conversation
Walkthrough
The changes in this pull request involve modifications to several files, primarily within the `packages/state` package and the `examples/example-vite-react-app-recs` example: logging defaults, helper-function refactoring around `convertValues`, expanded tests, and dependency versioning.
Actionable comments posted: 7
🧹 Outside diff range and nitpick comments (5)
examples/example-vite-react-app-recs/src/dojo/setup.ts (2)

91-91: LGTM: `getSync` added to return object.

The addition of `getSync` to the return object is correct and consistent with its initialization earlier in the function. This makes the new functionality available to consumers of the `setup` function.

Consider adding a brief comment above the `getSync` property in the return object to explain its purpose, similar to other properties like `eventSync`. This would improve code readability and maintainability.
Line range hint 1-93: Overall, the changes look good with some areas for improvement.

The introduction of `getSyncEntities` and the new `getSync` functionality appears to be a valuable addition to the setup process. However, there are a few points to address:

- Clarify the purpose and types of parameters passed to `getSyncEntities`.
- Consider adding comments to explain the new functionality and its significance in the overall setup process.
- Ensure consistent usage of `getSyncEntities` across the project.

These improvements will enhance the code's readability and maintainability.
As this change introduces new synchronization functionality, it might be worth considering how this impacts the overall architecture of the application. Does this change affect performance or introduce any new dependencies that need to be managed? It may be beneficial to update any relevant documentation or architecture diagrams to reflect this new capability.
packages/state/src/recs/index.ts (3)

52-52: LGTM. Consider updating the documentation.

The change to set the `logging` default to `false` is good for reducing unnecessary output in production. However, ensure that the function's documentation is updated to reflect this change in default behavior.

Update the function's JSDoc comment to indicate the new default value for the `logging` parameter, for example:
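A minimal JSDoc sketch; the description text is illustrative only and should be taken from the existing comment in packages/state/src/recs/index.ts:

```ts
/**
 * Fetches entities from Torii and syncs them into the recs world.
 * (Description above is illustrative, not the actual existing JSDoc.)
 *
 * @param logging - Enables verbose sync logging. Defaults to `false`.
 */
```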
202-202: LGTM. Update the documentation for consistency.

The change to set the `logging` default to `false` is consistent with the previous function and good for reducing unnecessary output. Ensure that the function's documentation is updated to reflect this change in default behavior.

Update the function's JSDoc comment to indicate the new default value for the `logging` parameter.
Line range hint 313-373: LGTM. Enhanced logging improves debugging capabilities.

The changes to the `setEntities` function are well-implemented:

- Setting the `logging` default to `false` is consistent with other functions.
- The new logging statements provide valuable debugging information.
- The added error check for undefined or invalid conversion results is a good safeguard.
These improvements will make it easier to diagnose issues during development and testing.
Consider using a logging library or a custom logger function to handle these log statements. This would allow for easier management of log levels and output formatting in the future.
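A minimal sketch of such a custom logger, assuming the functions keep their boolean `logging` flag; the function and type names below are illustrative, not part of the existing codebase:

```ts
type LogLevel = "info" | "warn" | "error";

// Thin wrapper around console that centralizes the logging gate and formatting.
function createLogger(enabled: boolean, prefix = "[state/recs]") {
    const log = (level: LogLevel, ...args: unknown[]) => {
        if (!enabled && level === "info") return; // only gate informational output
        console[level](prefix, ...args);
    };
    return {
        info: (...args: unknown[]) => log("info", ...args),
        warn: (...args: unknown[]) => log("warn", ...args),
        error: (...args: unknown[]) => log("error", ...args),
    };
}

// Hypothetical usage inside setEntities:
// const logger = createLogger(logging);
// logger.info("Setting entities:", entities.length);
```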
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
⛔ Files ignored due to path filters (1)
- `pnpm-lock.yaml` is excluded by `!**/pnpm-lock.yaml`
📒 Files selected for processing (5)
- examples/example-vite-react-app-recs/src/dojo/setup.ts (3 hunks)
- packages/state/package.json (1 hunks)
- packages/state/src/tests/utils.test.ts (2 hunks)
- packages/state/src/recs/index.ts (5 hunks)
- packages/state/src/utils/index.ts (3 hunks)
🧰 Additional context used
🔇 Additional comments (21)
examples/example-vite-react-app-recs/src/dojo/setup.ts (1)

3-3: LGTM: Import statement updated correctly.

The import statement has been properly updated to include `getSyncEntities`, which is used later in the file. This change is consistent with the new functionality being added.

packages/state/src/recs/index.ts (1)
Line range hint 1-377: Overall, these changes improve code quality and debugging capabilities.

The modifications in this file demonstrate a consistent approach to:

- Reduce default verbosity by setting `logging` to `false` by default.
- Enhance debugging capabilities, particularly in the `setEntities` function.
- Improve error handling and reporting during the entity update process.

These changes will likely result in cleaner production logs and easier troubleshooting during development and testing phases.

To further improve the codebase:

- Update the JSDoc comments for all modified functions to reflect the new default `logging` behavior.
- Consider implementing a centralized logging mechanism to manage log levels and formatting consistently across the entire module.
packages/state/src/utils/index.ts (5)

Line range hint 1-46: Refactoring improves readability and maintainability

The extraction of complex logic into helper functions (`handleStringArray`, `handleBigInt`, `handleDefault`) simplifies the `convertValues` function and enhances readability and maintainability.

8-10: Null check correctly handles null and undefined values

The use of `value == null` effectively checks for both `null` and `undefined`, ensuring that these cases are appropriately handled.

20-21: Integration of `handleStringArray` enhances modularity

By delegating string array handling to `handleStringArray`, the code becomes more modular and easier to maintain.

28-29: Integration of `handleBigInt` improves code clarity

Using `handleBigInt` centralizes BigInt conversion logic, making the code cleaner and more maintainable.

40-41: Use of `handleDefault` function enhances extensibility

Delegating to `handleDefault` in the default case allows for easier extension and handling of additional types in the future.

packages/state/src/__tests__/utils.test.ts (14)
1-2: Imports updated appropriately

The addition of `Schema` and `vi` to the imports ensures that all necessary dependencies are available for the extended test cases.

112-127: Test for handling null and undefined values is comprehensive

This test correctly verifies that `convertValues` can handle `undefined` values in the input, ensuring that optional fields are managed appropriately without causing errors.

128-139: Accurate conversion of enum types to strings

The test ensures that enum values are correctly converted to their string representations, validating the handling of enum types within `convertValues`.

141-152: Proper handling of empty arrays in `RecsType.StringArray`

This test confirms that `convertValues` correctly processes an empty array for `RecsType.StringArray`, which is important for cases where array fields may legitimately be empty.

154-171: Correct conversion of enum items within `StringArray`

The test verifies that `convertValues` can handle arrays containing enum values, converting each enum item to its string representation as expected.

195-214: Effective handling of BigInt conversion failures

The test appropriately mocks `console.warn` to suppress warnings during execution and verifies that a warning is issued when BigInt conversion fails, ensuring that such cases are properly logged (see the sketch below).
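A minimal sketch of that mocking pattern with Vitest; the test name and the omitted `convertValues` call are illustrative, not the actual test in this PR:

```ts
import { it, expect, vi } from "vitest";

it("warns when BigInt conversion fails (illustrative)", () => {
    // Silence console.warn for this test while still recording calls.
    const warnSpy = vi.spyOn(console, "warn").mockImplementation(() => {});

    // ... invoke convertValues here with a value that cannot be parsed as BigInt ...

    expect(warnSpy).toHaveBeenCalled();
    warnSpy.mockRestore();
});
```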
216-227: Validation of `RecsType.String` conversion

This test confirms that string values are correctly returned by `convertValues`, ensuring basic string handling functions as expected.

229-240: Accurate conversion of valid BigInt values

The test ensures that `convertValues` successfully converts string representations of BigInt numbers into actual `BigInt` types, verifying the function's ability to handle large integers.

242-253: Correct handling of boolean values

This test verifies that `convertValues` processes boolean values accurately, confirming proper handling of `RecsType.Boolean`.

255-266: Proper conversion of numeric strings to numbers

The test ensures that numeric string values are converted to numbers correctly, validating the numeric conversion logic within `convertValues`.

268-295: Successful processing of nested structures

The test demonstrates that `convertValues` can handle nested schemas, converting nested values appropriately. This is crucial for complex data structures that contain multiple layers.

323-334: Appropriate fallback for unknown types

The test confirms that when `convertValues` encounters an unknown type, it falls back to returning the original string value. This ensures robustness against unexpected input types.

336-343: Correct behavior with empty schema

This test verifies that `convertValues` returns an empty object when provided with an empty schema, which is the expected behavior in such cases.

297-321: Verify handling of `Map` structures in `convertValues`

In this test case, `values.config.value` is a `Map` object. Ensure that `convertValues` is capable of processing `Map` structures. If `convertValues` expects plain objects, the test might not behave as intended, and the handling of `Map` may need to be adjusted.

Run the following script to verify if `convertValues` handles `Map` inputs:
@@ -25,7 +25,7 @@
     },
     "dependencies": {
         "@dojoengine/recs": "2.0.13",
-        "@dojoengine/torii-client": "workspace:*",
+        "@dojoengine/torii-client": "1.0.0-alpha.20",
💡 Codebase verification
Inconsistent `@dojoengine/torii-client` Versions Detected

The `packages/state/package.json` is using version `1.0.0-alpha.20` for `@dojoengine/torii-client`, while other packages are referencing it as `workspace:*`. This discrepancy may lead to version conflicts and affect dependency management across the project.
- Action Required:
  - Update all `package.json` files to use a consistent version for `@dojoengine/torii-client`.
  - Alternatively, revert `packages/state/package.json` to use `workspace:*` if the specific versioning is not necessary.
🔗 Analysis chain
LGTM! Consider the impact on development workflow.
The change from `workspace:*` to a specific version `1.0.0-alpha.20` for `@dojoengine/torii-client` is approved. This update aligns with the package version, which is a good practice for version consistency.
However, please consider the following:
- This change might impact the development workflow, especially if you were previously using a local workspace reference.
- Ensure that this version is consistent with other packages in the project that might depend on `@dojoengine/torii-client`.
To verify version consistency across the project, run the following script:
This script will help identify any inconsistencies in the usage of `@dojoengine/torii-client` across the project.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check for inconsistencies in @dojoengine/torii-client versions across the project
# Search for @dojoengine/torii-client in all package.json files
echo "Searching for @dojoengine/torii-client in package.json files:"
rg --type json '"@dojoengine/torii-client"' -g 'package.json'
# Search for import statements of @dojoengine/torii-client in TypeScript/JavaScript files
echo -e "\nSearching for import statements of @dojoengine/torii-client:"
rg --type-add 'ts:*.{ts,tsx,js,jsx}' --type ts 'from\s+[''"]@dojoengine/torii-client[''"]'
Length of output: 3633
const getSync = await getSyncEntities(
    toriiClient,
    contractComponents as any,
    undefined,
    [],
    3000,
    true
);
💡 Codebase verification
Clarification Needed on `getSyncEntities` Parameters

The initialization of `getSync` looks good, but there are a few points that need clarification:
- The `contractComponents` is cast to `any`. Is this necessary, or can we use a more specific type?
- Could you explain the purpose of the hardcoded values (`undefined`, empty array, `3000`, `true`)?
- Consider adding comments to explain the significance of these parameters for better maintainability.
🔗 Analysis chain
Please clarify the parameters passed to `getSyncEntities`.

The initialization of `getSync` looks good, but there are a few points that need clarification:
- The `contractComponents` is cast to `any`. Is this necessary, or can we use a more specific type?
- Could you explain the purpose of the hardcoded values (undefined, empty array, 3000, true)?
- Consider adding comments to explain the significance of these parameters for better maintainability (see the sketch below).
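A minimal sketch of a self-documenting call, for illustration only: the meanings given to the trailing arguments (a key filter, an entity ID list, a polling interval, and a logging flag), the variable names, and the import path are all assumptions, not the library's documented API.

```ts
import { getSyncEntities } from "@dojoengine/state"; // assumed import path

// Hypothetical wrapper showing one way to name the hardcoded arguments.
async function startEntitySync(toriiClient: any, contractComponents: any) {
    const entityKeyClause = undefined; // assumed: no key filter, sync everything
    const entityIds: string[] = [];    // assumed: no specific entity IDs
    const pollIntervalMs = 3000;       // assumed: re-query interval in milliseconds
    const enableLogging = true;        // assumed: verbose sync logging

    return getSyncEntities(
        toriiClient,
        contractComponents,
        entityKeyClause,
        entityIds,
        pollIntervalMs,
        enableLogging
    );
}
```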
To ensure `getSyncEntities` is used consistently across the project, let's check its usage:
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for other usages of getSyncEntities
rg --type typescript 'getSyncEntities\(' -A 5
Length of output: 82
Script:
#!/bin/bash
# Search for other usages of getSyncEntities in TypeScript files
rg --type ts --type tsx 'getSyncEntities\(' -A 5
Length of output: 78
Script:
#!/bin/bash
# Search for other usages of getSyncEntities in .ts and .tsx files
rg 'getSyncEntities\(' --glob '*.ts' --glob '*.tsx' -A 5
Length of output: 4520
function handleStringArray(value: any) {
    if (value.type === "array" && value.value.length === 0) {
        return [];
    }
    if (value.type === "array" && value.value[0]?.type === "enum") {
        return value.value.map((item: any) => item.value.option);
    }
    return value.value.map((a: any) => {
        try {
            return BigInt(a.value);
        } catch (error) {
            console.warn(
                `Failed to convert ${a.value} to BigInt. Using string value instead.`
            );
            return a.value;
        }
    });
}
Ensure consistent return types in `handleStringArray`

The `handleStringArray` function may return an array containing both `BigInt`s and `string`s if `BigInt` conversion fails. This could lead to inconsistent data types and potential issues downstream.

Consider ensuring that the function returns an array of consistent types. You may decide to handle conversion failures differently or provide a default value.
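A minimal sketch of one way to keep the result homogeneous, for illustration only: either every element converts to `BigInt`, or the whole array is returned as strings. The function name and exact fallback policy are assumptions, not the reviewed code.

```ts
function handleStringArrayConsistent(value: any): Array<bigint> | Array<string> {
    if (value.type === "array" && value.value.length === 0) {
        return [];
    }
    if (value.type === "array" && value.value[0]?.type === "enum") {
        // Enum options are already strings.
        return value.value.map((item: any) => item.value.option);
    }
    try {
        // All-or-nothing: every element must convert cleanly to BigInt...
        return value.value.map((a: any) => BigInt(a.value));
    } catch {
        // ...otherwise keep the entire array as strings instead of mixing types.
        console.warn("Failed to convert array to BigInt. Returning string values instead.");
        return value.value.map((a: any) => String(a.value));
    }
}
```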
function handleStringArray(value: any) {
    if (value.type === "array" && value.value.length === 0) {
        return [];
    }
    if (value.type === "array" && value.value[0]?.type === "enum") {
        return value.value.map((item: any) => item.value.option);
    }
    return value.value.map((a: any) => {
        try {
            return BigInt(a.value);
        } catch (error) {
            console.warn(
                `Failed to convert ${a.value} to BigInt. Using string value instead.`
            );
            return a.value;
        }
    });
}

function handleBigInt(value: string | bigint) {
    if (typeof value === "bigint") {
        return value;
    }
    try {
        return BigInt(value);
    } catch (error) {
        console.warn(
            `Failed to convert ${value} to BigInt. Attempting hexadecimal conversion.`
        );
        try {
            return BigInt(`0x${value}`);
        } catch (hexError) {
            console.warn(
                `Failed to convert 0x${value} to BigInt. Using string value instead.`
            );
            return value;
        }
    }
}

function handleDefault(schemaType: any, value: any) {
    if (typeof schemaType === "object" && value.type === "struct") {
        if (value.value instanceof Map) {
            const structValues = Object.fromEntries(value.value);
            return convertValues(schemaType, structValues);
        } else if (typeof value.value === "object") {
            // Handle cases where value.value might already be a plain object
            return convertValues(schemaType, value.value);
        } else {
            console.warn(
                `Expected value.value to be a Map or object for struct type, got ${typeof value.value}.`
            );
            return value.value;
        }
    }
    if (Array.isArray(schemaType) && value.type === "array") {
        return value.value.map((item: any) =>
            convertValues(schemaType[0], item)
        );
    }
    return value.value;
}
🛠️ Refactor suggestion
Enhance type safety by specifying parameter and return types
The helper functions use `any` for parameters and return types, which reduces the benefits of TypeScript's type checking.
Consider specifying more precise types for parameters and return values to improve type safety and catch potential errors at compile time.
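A sketch of what tighter typings might look like, assuming a simplified shape for Torii values; the `ToriiValue` interface and the function names below are illustrative, not the real torii-client types.

```ts
// Illustrative shape for the values handled by the helpers.
interface ToriiValue {
    type: "string" | "array" | "enum" | "struct" | string;
    value: any;
}

function handleBigIntTyped(value: string | bigint): bigint | string {
    if (typeof value === "bigint") return value;
    try {
        return BigInt(value);
    } catch {
        return value; // fall back to the original string on failure
    }
}

function handleStringArrayTyped(value: ToriiValue): Array<bigint | string> {
    return (value.value as ToriiValue[]).map((item) => handleBigIntTyped(item.value));
}
```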
    if (typeof value === "bigint") {
        return value;
    }
    try {
        return BigInt(value);
    } catch (error) {
        console.warn(
            `Failed to convert ${value} to BigInt. Attempting hexadecimal conversion.`
        );
        try {
            return BigInt(`0x${value}`);
        } catch (hexError) {
            console.warn(
                `Failed to convert 0x${value} to BigInt. Using string value instead.`
            );
            return value;
        }
    }
}
🛠️ Refactor suggestion
Simplify error handling in the `handleBigInt` function

The nested `try...catch` blocks in `handleBigInt` can be flattened to improve readability.
Consider restructuring the error handling to eliminate nesting:
function handleBigInt(value: string | bigint) {
if (typeof value === "bigint") {
return value;
}
try {
return BigInt(value);
} catch (error) {
try {
return BigInt(`0x${value}`);
} catch (hexError) {
console.warn(
`Failed to convert ${value} to BigInt. Using string value instead.`
);
return value;
}
}
}
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
function handleBigInt(value: string | bigint) {
    if (typeof value === "bigint") {
        return value;
    }
    try {
        return BigInt(value);
    } catch (error) {
        try {
            return BigInt(`0x${value}`);
        } catch (hexError) {
            console.warn(
                `Failed to convert ${value} to BigInt. Using string value instead.`
            );
            return value;
        }
    }
}
    if (typeof schemaType === "object" && value.type === "struct") {
        if (value.value instanceof Map) {
            const structValues = Object.fromEntries(value.value);
            return convertValues(schemaType, structValues);
        } else if (typeof value.value === "object") {
            // Handle cases where value.value might already be a plain object
            return convertValues(schemaType, value.value);
        } else {
            console.warn(
                `Expected value.value to be a Map or object for struct type, got ${typeof value.value}.`
            );
            return value.value;
        }
    }
Add null checks for `value.value` in `handleDefault`

In `handleDefault`, accessing `value.value` without checking for `null` could lead to runtime errors. Since `typeof null` returns `"object"`, the check `typeof value.value === "object"` may pass when `value.value` is actually `null`.

Consider adding a null check to ensure `value.value` is not `null`:
} else if (typeof value.value === "object" && value.value !== null) {
// Handle cases where value.value might already be a plain object
return convertValues(schemaType, value.value);
}
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
    if (typeof schemaType === "object" && value.type === "struct") {
        if (value.value instanceof Map) {
            const structValues = Object.fromEntries(value.value);
            return convertValues(schemaType, structValues);
        } else if (typeof value.value === "object" && value.value !== null) {
            // Handle cases where value.value might already be a plain object
            return convertValues(schemaType, value.value);
        } else {
            console.warn(
                `Expected value.value to be a Map or object for struct type, got ${typeof value.value}.`
            );
            return value.value;
        }
    }
it("should handle RecsType.StringArray with BigInt conversion", () => {
    const schema: Schema = {
        ids: RecsType.StringArray,
    };
    const values = {
        ids: {
            value: [
                { value: "12345678901234567890", type: "string" },
                { value: "98765432109876543210", type: "string" },
            ],
            type: "array",
        },
    };
    const expected = {
        ids: [
            BigInt("12345678901234567890"),
            BigInt("98765432109876543210"),
        ],
    };
    expect(convertValues(schema, values)).toEqual(expected);
});
Potential inconsistency: `RecsType.StringArray` expected to return strings

In this test case, `RecsType.StringArray` is used in the schema, but the expected result contains an array of `BigInt` values. Since `RecsType.StringArray` implies an array of strings, converting items to `BigInt` may be inconsistent with the type definition. Consider using a type like `RecsType.BigIntArray` if available, or adjust the expected output to reflect an array of strings.
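For illustration, if the string-array interpretation is kept, the test's expectation could be adjusted along these lines (a sketch only; the right fix depends on how `convertValues` is meant to behave):

```ts
const expectedAsStrings = {
    ids: ["12345678901234567890", "98765432109876543210"],
};

// expect(convertValues(schema, values)).toEqual(expectedAsStrings);
```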
Summary by CodeRabbit

New Features
- Added `getSync` entity synchronization to the example app setup via `getSyncEntities`.

Bug Fixes
- Improved the `convertValues` function, with expanded test coverage for various input scenarios.

Refactor
- Simplified the `convertValues` function by extracting complex logic into dedicated helper functions.

Documentation
- Updated dependency versioning in `package.json` for better version management.