merge dev to main (v2.9.0) #1869
Conversation
📝 Walkthrough

The pull request includes updates across multiple files, primarily focused on enhancing validation mechanisms, updating changelogs, and improving the handling of type definitions in the ZModel language. Key changes include new validation rules, modifications to existing functions for better type management, and improvements in error handling. The changelog for the JetBrains IDE plugin has been updated to reflect a newly fixed issue regarding validation errors with strongly typed JSON fields. Several new tests have been added to ensure the robustness of these changes.
Possibly related PRs
Actionable comments posted: 16
🧹 Outside diff range and nitpick comments (76)
tests/regression/tests/issue-1849.test.ts (1)
3-4: Add description of what's being tested

The test suite lacks a clear description of what issue 1849 is about and what specific regression is being tested.

Consider adding a more descriptive test title:

```diff
-describe('issue 1849', () => {
-    it('regression', async () => {
+describe('issue 1849 - enum imports and default values', () => {
+    it('should allow enum imports and default values in schema', async () => {
```

tests/regression/tests/issue-1835.test.ts (1)
1-28: Add documentation for the regression scenario

The test file would benefit from comments explaining:
- What issue #1835 ("Adding `@@ignore` to a model that references an enum fails to generate enhanced client") was about
- What specific behavior is being tested
- What the expected outcome should be

Add documentation at the top of the test file:

```diff
 import { loadSchema } from '@zenstackhq/testtools';

+/**
+ * Regression test for Issue #1835
+ *
+ * Description: [Add description of the issue]
+ * Expected behavior: [Add expected behavior]
+ */
 describe('issue 1835', () => {
```

tests/regression/tests/issue-756.test.ts (1)
Line range hint 1-31: Enhance test coverage for member access validation

While the test correctly verifies the error case, consider adding positive test cases to ensure valid member access patterns still work as expected.

Consider adding test cases for:
- Valid access to existing fields
- Nested member access
- Different relation types

```ts
// Example positive test case
it('allows valid member access', async () => {
    const model = await loadModel(`
        // ... same schema ...
        @@allow('all', auth().id == authorId)
    `);
    expect(model).toBeDefined();
});
```

packages/schema/src/language-server/utils.ts (1)
32-37: Enhance JSDoc documentation for better clarity

The implementation is clean and well-structured. Consider enhancing the documentation to provide more context:

```diff
 /**
- * Checks if the given node can contain resolvable members.
+ * Checks if the given node can contain resolvable members (fields or properties that can be
+ * referenced and resolved during type checking).
+ * @param node - The node to check
+ * @returns True if the node is either a DataModel or TypeDef, indicating it can contain
+ * resolvable members
  */
```

tests/regression/tests/issue-1857.test.ts (2)
3-4: Consider using a more descriptive test name

The current test name 'regression' is too generic. Consider renaming it to better describe what's being tested, e.g., 'should validate nullable string fields in JSON content'.

```diff
-    it('regression', async () => {
+    it('should validate nullable string fields in JSON content', async () => {
```

43-43: Consider adding more test cases

The current test only validates one happy path. Consider adding test cases for:
- Invalid type field (required field missing or wrong type)
- Various text values (undefined, empty string, complex string)
- Malformed JSON content

```ts
// Example additional test cases:
zodSchemas.models.JSONContentSchema.parse({ type: 'foo' }); // text undefined
expect(() =>
    zodSchemas.models.JSONContentSchema.parse({ text: 'foo' })
).toThrow(); // missing required type
zodSchemas.models.JSONContentSchema.parse({ type: 'foo', text: '' }); // empty string
```

packages/schema/src/plugins/enhancer/index.ts (1)
29-29: Consider consistent variable naming

The variable `newPrismaClientDtsPath` is later referenced as `prismaClientDtsPath`. Consider using consistent naming throughout.

```diff
-    const { dmmf, newPrismaClientDtsPath } = await new EnhancerGenerator(model, options, project, outDir).generate();
+    const { dmmf, prismaClientDtsPath } = await new EnhancerGenerator(model, options, project, outDir).generate();
```

packages/sdk/src/code-gen.ts (2)
24-38: Consider using asynchronous file operations

The `saveSync` operation could block the event loop for large files. Consider using asynchronous operations for better performance.

```diff
-    sourceFile.saveSync();
+    await sourceFile.save();
```

Improve template literal formatting

The indentation in the template literal might cause formatting issues. Consider using a dedicated template for the header.

```diff
-    sourceFile.replaceWithText(
-        `/******************************************************************************
-* This file was generated by ZenStack CLI ${pkgJson.version}.
-******************************************************************************/
-
-/* eslint-disable */
-// @ts-nocheck
-
-    ${sourceFile.getText()}
-    `
+    const header = [
+        '/******************************************************************************',
+        `* This file was generated by ZenStack CLI ${pkgJson.version}.`,
+        '*****************************************************************************/',
+        '',
+        '/* eslint-disable */',
+        '// @ts-nocheck',
+        '',
+    ].join('\n');
+    sourceFile.replaceWithText(`${header}${sourceFile.getText()}`
     );
```
44-44: Maintain consistency in async/sync operations

The function mixes synchronous `saveSourceFile` operations with an asynchronous `project.save()`. Consider making all operations consistently asynchronous for better maintainability.

```diff
-    project.getSourceFiles().forEach(saveSourceFile);
-    await project.save();
+    await Promise.all(project.getSourceFiles().map(async (file) => await saveSourceFile(file)));
+    await project.save();
```

packages/schema/src/language-server/validator/schema-validator.ts (1)
39-40: Consider using a more descriptive variable name

The variable name `decls` could be more descriptive to better reflect that it contains both data models and type definitions.

```diff
-    const decls = getDataModelAndTypeDefs(model, true);
-    const authModels = decls.filter((d) => isDataModel(d) && hasAttribute(d, '@@auth'));
+    const modelAndTypeDefs = getDataModelAndTypeDefs(model, true);
+    const authModels = modelAndTypeDefs.filter((d) => isDataModel(d) && hasAttribute(d, '@@auth'));
```

packages/schema/src/plugins/enhancer/enhance/model-typedef-generator.ts (1)
20-24: Consider updating documentation about nullable optional fields

Since this change affects type generation behavior for all optional fields, consider updating the documentation to explicitly mention that optional fields are treated as nullable in generated TypeScript definitions. This will help users understand the type system behavior and avoid potential confusion.
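For illustration, the described behavior could be documented with a small example like the following (the generated shape shown here is an assumption based on the description above, not actual generator output):

```ts
// Assumed generated shape for an optional field declared on a type def (illustration only):
interface Profile {
    bio?: string | null; // optional fields are emitted as both optional and nullable
}

// Both of these would then type-check:
const a: Profile = {};            // field omitted
const b: Profile = { bio: null }; // field explicitly null
```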
tests/regression/tests/issue-1859.test.ts (1)
24-38: Improve type safety and error handling in the custom method

While the test functionality is correct, consider these improvements:
- Replace `any` with proper Prisma types for args
- Add error handling
- Define return type interface

```diff
+ type ListViewResponse<T> = {
+     view: boolean;
+     data: T[];
+ };
+
  const extended = db.$extends({
      model: {
          post: {
-             findManyListView: async (args: any) => {
-                 return { view: true, data: await db.post.findMany(args) };
+             findManyListView: async (
+                 args?: Parameters<typeof db.post.findMany>[0]
+             ): Promise<ListViewResponse<Post>> => {
+                 try {
+                     const data = await db.post.findMany(args);
+                     return { view: true, data };
+                 } catch (error) {
+                     throw new Error(`Failed to fetch posts: ${error.message}`);
+                 }
              },
          },
      },
  });
```

tests/regression/tests/issue-prisma-extension.test.ts (5)
27-29: Consider using a more specific type instead of 'any'

The `args` parameter could benefit from a more specific type definition, likely `Prisma.PostFindManyArgs`.

```diff
-    findManyListView: async (args: any) => {
+    findManyListView: async (args: Prisma.PostFindManyArgs) => {
```
34-39: Enhance test coverage with additional assertions

Consider adding assertions for the following (see the sketch after this list):
- Error cases (e.g., invalid args)
- Edge cases (e.g., empty result set)
- Data integrity (verify all expected fields are present)
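A rough sketch of such assertions, reusing the `extended.post.findManyListView` helper from this test (the exact field names and the error behavior are assumptions):

```ts
// Edge case: a filter that matches nothing still returns the wrapper shape
const empty = await extended.post.findManyListView({ where: { title: 'no-such-title' } });
expect(empty.view).toBe(true);
expect(empty.data).toHaveLength(0);

// Data integrity: expected fields are present on returned rows
const all = await extended.post.findManyListView();
expect(all.data[0]).toEqual(expect.objectContaining({ id: expect.anything(), title: expect.any(String) }));

// Error case: invalid args should be rejected (assumes Prisma throws on unknown fields)
await expect(extended.post.findManyListView({ where: { noSuchField: 1 } } as any)).rejects.toThrow();
```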
58-68: Improve type safety in extension definition

The extension definition uses the 'any' type, which could be replaced with proper Prisma types.

```diff
-    const ext = prismaModule.defineExtension((_prisma: any) => {
+    const ext = prismaModule.defineExtension((_prisma: typeof prisma) => {
```
81-82: Enhance the behavioral difference documentation

The current comment could be more detailed to explain why the un-enhanced client is used internally and its implications.

```diff
-    // findManyListView internally uses the un-enhanced client
+    // Note: findManyListView internally uses the un-enhanced client because
+    // it was defined with the original prisma client reference.
+    // This means it bypasses the enhancement's permission checks.
```
3-90: Consider splitting the test cases for better organization

While the current structure works, consider organizing these tests into separate describe blocks for enhanced clients and extended clients. This would make it easier to add more related test cases in the future and improve test organization.

Example structure:

```ts
describe('issue prisma extension', () => {
    describe('enhanced clients', () => {
        it('extend enhanced client', ...);
        // future related tests
    });
    describe('extended clients', () => {
        it('enhance extended client', ...);
        // future related tests
    });
});
```

tests/regression/tests/issue-1843.test.ts (2)
93-95: Enhance test assertions with specific expectations

The current assertions only check if the operations resolve successfully. Consider adding more specific checks:
- Verify the created post's properties
- Ensure relationships are properly established
- Add negative test cases for invalid scenarios

Example improvement:

```diff
-await expect(
-    db.postWithCoauthor.create({ data: { title: 'new post', coauthor: { connect: { id: user.id } } } })
-).toResolveTruthy();
+const post = await db.postWithCoauthor.create({
+    data: { title: 'new post', coauthor: { connect: { id: user.id } } }
+});
+expect(post).toMatchObject({
+    title: 'new post',
+    coauthorId: user.id,
+    published: false
+});
```

Also applies to: 98-100
103-106: Add comprehensive Zod schema validation tests

The current Zod validation test is minimal and uses a hardcoded ID.

Consider adding:
- Validation tests for all required fields
- Tests for invalid data scenarios
- Use the actual user.id instead of the hardcoded '1'

```diff
-zodSchemas.models.PostWithCoauthorCreateSchema.parse({
-    title: 'new post',
-    coauthorId: '1',
-});
+// Test valid data
+expect(() =>
+    zodSchemas.models.PostWithCoauthorCreateSchema.parse({
+        title: 'new post',
+        coauthorId: user.id,
+    })
+).not.toThrow();
+
+// Test invalid data
+expect(() =>
+    zodSchemas.models.PostWithCoauthorCreateSchema.parse({
+        title: '', // empty title
+        coauthorId: 'invalid-id',
+    })
+).toThrow();
```

tests/integration/tests/enhancements/json/validation.test.ts (1)
40-158: Well-structured test cases with good coverage!

The test cases are well-organized and cover important validation scenarios for JSON field typing. They effectively test both positive and negative cases for member access, collection access, and type-level attributes.

Consider adding these additional test cases for completeness:
- Nested JSON field access validation
- Multiple JSON fields in a single model
- Edge cases with empty collections
- Type validation for array elements

Example test case for nested JSON:

```ts
it('validates nested JSON field access in policy rules', async () => {
    await expect(
        loadModel(`
            type Address {
                street String
                city String
            }
            type Profile {
                address Address
            }
            model User {
                id Int @id @default(autoincrement())
                profile Profile @json
                @@allow('all', auth().profile.address.city == 'New York')
            }
        `)
    ).toResolveTruthy();
});
```

packages/plugins/tanstack-query/tests/portable.test.ts (4)
34-34
: Consider more specific version pinningUsing
5.56.x
for @tanstack/react-query could lead to inconsistent test behavior if patch versions introduce breaking changes.-'@tanstack/[email protected]' +'@tanstack/[email protected]'
52-110
: Ensure proper cleanup of temporary directoriesWhile
tmp.dirSync
is used withunsafeCleanup: true
, it's better to explicitly clean up in anafterAll
orafterEach
block to prevent test artifacts from persisting.Add cleanup code:
+let tempDir: tmp.DirResult; + +beforeEach(() => { + tempDir = tmp.dirSync({ unsafeCleanup: true }); +}); + +afterEach(() => { + tempDir.removeCallback(); +}); + it('supports portable for custom prisma client output', async () => { - const t = tmp.dirSync({ unsafeCleanup: true }); - const projectDir = t.name; + const projectDir = tempDir.name; // ... rest of the test });
112-152
: Enhance logical client test coverageThe current test only verifies basic inheritance. Consider adding test cases for:
- Multiple levels of inheritance
- Multiple child models
- Access to methods from parent model
Example additions:
model Base { id Int @id @default(autoincrement()) createdAt DateTime @default(now()) type String + baseField String @default("base") @@delegate(type) } model User extends Base { email String } + +model Admin extends User { + permissions String[] +}Also verify the generated hooks include inherited fields:
const { data } = useFindUniqueAdmin({ where: { id: 1 } }); console.log(data?.baseField); // Should be accessible console.log(data?.email); // Should be accessible
7-153
: Consider extracting common test utilitiesThe test file has repeated patterns for schema loading and verification. Consider creating helper functions to:
- Set up common schema parts
- Verify generated hooks
- Handle temporary directories
This would make the tests more maintainable and reduce duplication.Example structure:
function setupTestSchema(schema: string, options?: Partial<LoadSchemaOptions>) { const defaultOptions = { provider: 'postgresql', pushDb: false, extraDependencies: ['[email protected]', '@types/[email protected]', '@tanstack/[email protected]'], // ... other common options }; return loadSchema(schema, { ...defaultOptions, ...options }); } function verifyGeneratedHooks(projectDir: string) { // Common verification logic }packages/schema/src/plugins/prisma/index.ts (2)
84-95
: Consider enhancing error handling and resolution strategyWhile the type resolution logic is sound, consider these improvements:
- Include more specific error information in the warning message
- Add additional common paths to the resolution search
try { const prismaClientResolvedPath = require.resolve(clientOutputDir, { - paths: [path.dirname(options.schemaPath)], + paths: [ + path.dirname(options.schemaPath), + process.cwd(), + path.join(process.cwd(), 'node_modules'), + ], }); prismaClientDtsPath = path.join(path.dirname(prismaClientResolvedPath), 'index.d.ts'); } catch (err) { console.warn( colors.yellow( - `Could not resolve PrismaClient type declaration path. This may break plugins that depend on it.` + `Could not resolve PrismaClient type declaration path at "${clientOutputDir}". ` + + `This may break plugins that depend on it. Error: ${err instanceof Error ? err.message : 'Unknown error'}` ) ); }
Line range hint
31-110
: Consider documenting path resolution strategy for plugin developersThe changes introduce a more sophisticated path resolution strategy for Prisma client types. Consider:
- Documenting the path resolution strategy in the plugin development guide
- Adding examples of common configuration patterns
- Providing troubleshooting steps for type resolution issues
This will help plugin developers understand how to properly integrate with the type system.
packages/language/src/zmodel.langium (3)
184-189
: Track the TODO and verify attribute handling.The TODO suggests a significant architectural change to unify TypeDef and abstract DataModel. While the current implementation moves in that direction by allowing DataModelAttribute in TypeDef, we should ensure this is tracked.
Would you like me to create a GitHub issue to track the unification of TypeDef and abstract DataModel?
252-252
: Consider renaming DataModelAttribute.Since this attribute is used for both DataModel and TypeDef, a more generic name would be more appropriate. Consider alternatives like:
- EntityAttribute
- ModelTypeAttribute
- StructuralAttribute
Would you like me to:
- Create a GitHub issue to track the renaming task?
- Generate a comprehensive list of all occurrences that would need to be updated?
Line range hint
69-252
: Consider documenting the architectural evolution.The changes show a clear direction toward unifying the type system between DataModel and TypeDef. Consider adding a design document or ADR (Architecture Decision Record) to:
- Document the rationale behind the unification
- Track the incremental steps being taken
- Outline the end goal architecture
packages/schema/src/language-server/validator/function-invocation-validator.ts (1)
99-102
: Consider improving error message specificityThe error message could be more specific about what types of field references are valid in this context, especially since the function name change suggests it might handle different types of fields.
- if (!getFieldReference(firstArg)) { - accept('error', 'first argument must be a field reference', { node: firstArg }); + if (!getFieldReference(firstArg)) { + accept('error', 'first argument must be a valid field reference (e.g., model field, type field)', { node: firstArg });packages/runtime/src/enhancements/node/query-utils.ts (3)
236-238
: Add JSDoc return type documentation.The method documentation should specify the return type structure for better developer experience.
/** * Gets relation info for a foreign key field. + * @param model - The model name to look up + * @param fkField - The foreign key field name to find relations for + * @returns An object containing the relation field, ID field, and foreign key field if found, undefined otherwise */
239-255
: Add explicit return type and interface.The method would benefit from explicit typing of its return value for better type safety and documentation.
+interface RelationInfo { + relation: FieldInfo; + idField: string; + fkField: string; +} -getRelationForForeignKey(model: string, fkField: string) { +getRelationForForeignKey(model: string, fkField: string): RelationInfo | undefined {
245-252
: Consider performance optimization for large models.The current implementation iterates through all fields to find a matching foreign key. For models with many fields, this could be inefficient.
Consider creating an index map of foreign key fields to their relations during model initialization, trading memory for lookup speed if this method is frequently called.
Example approach:
private fkFieldMap: Map<string, Map<string, RelationInfo>> = new Map(); private initFkFieldMap() { for (const [modelName, info] of Object.entries(this.options.modelMeta)) { const modelMap = new Map<string, RelationInfo>(); for (const field of Object.values(info.fields)) { if (field.foreignKeyMapping) { for (const [idField, fkField] of Object.entries(field.foreignKeyMapping)) { modelMap.set(fkField, { relation: field, idField, fkField }); } } } if (modelMap.size > 0) { this.fkFieldMap.set(modelName, modelMap); } } }packages/schema/src/utils/ast-utils.ts (1)
263-269
: LGTM with a minor documentation suggestionThe implementation correctly handles both DataModel and TypeDef entities. Consider enhancing the documentation to specify the return type and mention that it returns an array of both DataModel and TypeDef instances.
/** - * Gets all data models and type defs from all loaded documents + * Gets all data models and type defs from all loaded documents + * @returns Array<DataModel | TypeDef> An array containing all data models and type definitions */tests/integration/tests/enhancements/json/typing.test.ts (2)
339-343
: Add documentation to the Content typeThe
Content
type implements a recursive structure but lacks documentation explaining its purpose and usage.Consider adding documentation:
+/// Represents a hierarchical content structure that can contain nested content blocks +/// @example +/// { +/// type: "text", +/// content: [{ type: "text", text: "hello" }] +/// } type Content { type String content Content[]? text String? }
362-380
: Enhance test coverage for recursive typesThe test demonstrates a happy path but should also verify error cases and edge conditions.
Consider adding these test cases:
// Test maximum nesting depth const deepContent: Content = { type: 'text', content: Array(100).fill(null).map(() => ({ type: 'text' })) }; // Test empty content array const emptyContent: Content = { type: 'text', content: [] }; // Test missing optional fields const minimalContent: Content = { type: 'text' };Also, add assertions to verify the structure after retrieval:
const post = await db.post.create({ data: { content } }); -console.log(post.content.content?.[0].content?.[0].text); +expect(post.content.content?.[0].content?.[0].text).toBe('hello'); +expect(post.content.type).toBe('text'); +expect(post.content.content?.length).toBe(1);packages/runtime/src/enhancements/node/proxy.ts (1)
257-274
: Consider adding type safety improvements.While the implementation is functionally correct, consider adding type safety improvements:
- if (prop === '$extends') { + if (prop === '$extends' as keyof typeof target) { const $extends = Reflect.get(target, prop, receiver); - if ($extends && typeof $extends === 'function') { + if ($extends && typeof $extends === 'function' && !('then' in $extends)) { return (...args: any[]) => { const result = $extends.bind(target)(...args); - if (!result[PRISMA_PROXY_ENHANCER]) { + if (result && typeof result === 'object' && !result[PRISMA_PROXY_ENHANCER]) {These changes would:
- Add type safety for the property access
- Ensure
$extends
isn't a Promise (which would also be a function)- Add null/undefined checks for the result
packages/schema/src/language-server/validator/attribute-application-validator.ts (1)
72-74
: LGTM! Consider adding test coverage.The new validation logic for preventing unsupported attributes on type declarations is well-implemented and consistent with the existing validation patterns. The error message is clear and follows the established style.
Consider adding a test case to verify this validation, ensuring it (see the sketch after this list):
- Fails when applying an attribute without @@@supportTypeDef to a type declaration
- Passes when applying an attribute with @@@supportTypeDef to a type declaration
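A possible shape for those two cases, following the `loadModel` pattern used elsewhere in this PR's tests (the specific attributes used and the assumption that `loadModel` rejects on validation errors are illustrative, not verified):

```ts
// Negative: an attribute not marked with @@@supportTypeDef should be rejected on a type decl
await expect(
    loadModel(`
        type Profile {
            age Int
            @@index([age]) // assumption: @@index lacks @@@supportTypeDef
        }
    `)
).rejects.toThrow();

// Positive: an attribute marked with @@@supportTypeDef should be accepted
await expect(
    loadModel(`
        type Profile {
            age Int
            @@validate(age > 0) // assumption: @@validate is marked with @@@supportTypeDef
        }
    `)
).toResolveTruthy();
```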
tests/integration/tests/enhancements/json/crud.test.ts (1)
194-232
: Consider adding edge cases and improving error messagesThe test case provides good coverage of validation rules, but could be enhanced with:
- Edge cases around the age boundary (age=18)
- More descriptive error messages in expect statements to clarify failure reasons
Consider adding these test cases:
expect(schema.safeParse({ age: 10, address: { city: 'NY' } })).toMatchObject({ success: false }); +expect(schema.safeParse({ age: 18, address: { city: 'NY' } })).toMatchObject({ + success: false, + error: expect.objectContaining({ message: expect.stringContaining('age must be greater than 18') }) +}); expect(schema.safeParse({ age: 20, address: { city: 'NYC' } })).toMatchObject({ success: false });packages/schema/src/cli/plugin-runner.ts (1)
Line range hint
139-185
: Consider adding tests for the new path handlingThe addition of
prismaClientDtsPath
warrants test coverage to ensure proper handling across different plugin scenarios.Would you like me to help generate test cases for:
- Core plugin path propagation
- User plugin option handling
- Edge cases (undefined paths, invalid paths)
packages/sdk/src/model-meta-generator.ts (1)
164-165
: Consider adding type validation for parameters.While the implementation is correct, consider adding type validation to ensure the arrays are not null/undefined before spreading them.
function writeAuthModel(writer: CodeBlockWriter, dataModels: DataModel[], typeDefs: TypeDef[]) { + if (!dataModels || !typeDefs) { + throw new Error('dataModels and typeDefs must be defined'); + } const authModel = getAuthDecl([...dataModels, ...typeDefs]); if (authModel) { writer.writeLine(`authModel: '${authModel.name}'`); } }packages/schema/src/plugins/enhancer/policy/utils.ts (1)
541-553
: Consider optimizing the enum reference check.The implementation correctly checks enum references across all data models. However, we could optimize it by using
some
for early termination when a reference is found.Consider this optimization:
export function isEnumReferenced(model: Model, decl: Enum): unknown { const dataModels = getDataModels(model); return dataModels.some((dm) => { - return streamAllContents(dm).some((node) => { + return streamAllContents(dm).some((node) => if (isDataModelField(node) && node.type.reference?.ref === decl) { - // referenced as field type - return true; + return true; // referenced as field type } if (isEnumFieldReference(node) && node.target.ref?.$container === decl) { - // enum field is referenced - return true; + return true; // enum field is referenced } - return false; - }); + return false + ); }); }tests/integration/tests/enhancements/with-policy/auth.test.ts (3)
871-885
: Consider enhancing type definitions with additional validation and typing.The type definitions are well-structured, but could be improved for better type safety and validation:
type Profile { - age Int + age Int @validate(min: 0, max: 150) } type Role { name String - permissions String[] + permissions Permission[] } +enum Permission { + CREATE + READ + UPDATE + DELETE +}
887-893
: Consider extracting complex authorization rules into named policies.The authorization rules are comprehensive but could be more maintainable if extracted into named policies:
model Foo { id Int @id @default(autoincrement()) - @@allow('read', true) - @@allow('create', auth().myId == 1 && !auth().banned) - @@allow('delete', auth().roles?['DELETE' in permissions]) - @@deny('all', auth().profile.age < 18) + @@allow('read', true) + @@allow('create', canCreate()) + @@allow('delete', hasDeletePermission()) + @@deny('all', isUnderAge()) } +// Named policies for better maintainability +policy canCreate() { + auth().myId == 1 && !auth().banned +} + +policy hasDeletePermission() { + auth().roles?['DELETE' in permissions] +} + +policy isUnderAge() { + auth().profile.age < 18 +}
868-922
: Consider adding test cases for edge cases and error scenarios.While the test coverage is good, consider adding these scenarios:
// Test edge cases for age restriction await expect( enhance({ myId: 1, profile: { age: 18 } }).foo.create({ data: {} }) ).toResolveTruthy(); // Test multiple roles with overlapping permissions await expect( enhance({ myId: 1, profile: { age: 20 }, roles: [ { name: 'ADMIN', permissions: ['CREATE'] }, { name: 'MANAGER', permissions: ['DELETE'] } ] }).foo.delete({ where: { id: r.id }, }) ).toResolveTruthy(); // Test handling of undefined optional fields await expect( enhance({ myId: 1, profile: { age: 20 }, roles: undefined }) .foo.delete({ where: { id: r.id } }) ).toBeRejectedByPolicy();packages/schema/src/plugins/prisma/schema-generator.ts (2)
Line range hint
1-24
: Consider refactoring for better maintainability and type safety.The field generation logic could be improved in several ways:
- Extract the attribute filtering logic into separate methods
- Add more descriptive error messages
- Consider using type guards for better type safety
Consider this refactoring:
private generateModelField(model: PrismaDataModel, field: DataModelField, addToFront = false) { + const fieldType = this.resolveFieldType(field); + if (!fieldType) { + throw new PluginError( + name, + `Unable to resolve field type for ${field.$container.name}.${field.name}. Please ensure the type is properly defined.` + ); + } + + const attributes = this.resolveFieldAttributes(field); + const docs = this.resolveFieldDocs(field); + + return model.addField(field.name, fieldType, attributes, docs, addToFront); } + +private resolveFieldType(field: DataModelField): ModelFieldType | undefined { + if (field.type.type) { + return new ModelFieldType(field.type.type, field.type.array, field.type.optional); + } + + if (field.type.reference?.ref) { + if (isTypeDef(field.type.reference.ref)) { + this.ensureSupportingTypeDefFields(this.zmodel); + return new ModelFieldType('Json', false, field.type.optional); + } + return new ModelFieldType( + field.type.reference.ref.name, + field.type.array, + field.type.optional + ); + } + + const unsupported = this.getUnsupportedFieldType(field.type); + return unsupported ? new ModelFieldType(unsupported, false, field.type.optional) : undefined; }
Line range hint
12-24
: Optimize name truncation for better performance.The current implementation of name truncation could be optimized by:
- Caching truncated names
- Using a more efficient string manipulation approach
Consider this optimization:
+private nameCache = new Map<string, string>(); + private truncate(name: string) { + if (this.nameCache.has(name)) { + return this.nameCache.get(name)!; + } + if (name.length <= IDENTIFIER_NAME_MAX_LENGTH) { + this.nameCache.set(name, name); return name; } - - const existing = this.shortNameMap.get(name); - if (existing) { - return existing; - }packages/plugins/openapi/src/rest-generator.ts (1)
Line range hint
1-1024
: Consider documenting JSON:API compliance.The generator implements the JSON:API specification (evident from response structures, relationship handling, etc.). Consider adding explicit documentation about this compliance to help API consumers understand the implementation details and available features.
Add a class-level JSDoc comment:
/** * Generates OpenAPI specification following the JSON:API specification. * @see https://jsonapi.org/ */ export class RESTfulOpenAPIGenerator extends OpenAPIGeneratorBase {packages/schema/src/plugins/zod/utils/schema-gen.ts (5)
Line range hint
38-44
: Correct conditional checks to include zero values in '@Length' validationsIn the '@Length' attribute handling, the conditions
if (min)
andif (max)
will skip validations whenmin
ormax
is zero because zero is a falsy value in JavaScript. This can lead to missing validations when zero is a valid constraint.Please update the conditions to explicitly check for
undefined
:- if (min) { + if (min !== undefined) { schema += `.min(${min}${messageArg})`; } - if (max) { + if (max !== undefined) { schema += `.max(${max}${messageArg})`; }
Line range hint
49-51
: Ensure empty strings are handled correctly in '@contains' validationThe current condition
if (expr)
will skip the.includes()
validation ifexpr
is an empty string since empty strings are falsy in JavaScript. If an empty string is a valid value for the'@contains'
attribute, this validation will not be added.Please modify the condition to check for
undefined
explicitly:- if (expr) { + if (expr !== undefined) { schema += `.includes(${JSON.stringify(expr)}${messageArg})`; }
Line range hint
54-56
: Ensure empty strings are handled correctly in '@startsWith' validationSimilar to the '@contains' case, the condition
if (text)
will skip the.startsWith()
validation iftext
is an empty string. This could result in missing validations when an empty string is a valid prefix.Update the condition to explicitly check for
undefined
:- if (text) { + if (text !== undefined) { schema += `.startsWith(${JSON.stringify(text)}${messageArg})`; }
Line range hint
59-61
: Ensure empty strings are handled correctly in '@ENDSWITH' validationThe condition
if (text)
in the '@ENDSWITH' attribute handling may skip validations whentext
is an empty string. This could lead to unintended behavior if empty strings are valid suffixes.Modify the condition to explicitly check for
undefined
:- if (text) { + if (text !== undefined) { schema += `.endsWith(${JSON.stringify(text)}${messageArg})`; }
Line range hint
82-92
: Handle potential exceptions in 'refineDecimal' functionIn the
refineDecimal
function, theDecimal
constructor could throw exceptions if the input is invalid. While thetry...catch
block handles exceptions, consider logging or providing more context in the error handling to aid in debugging.Optionally, you might log the error or provide a custom error message:
} catch (error) { - return false; + console.error('Decimal refinement error:', error); + return false; }packages/schema/src/language-server/zmodel-scope.ts (3)
156-159
: Simplify conditions to reduce code duplicationThe conditions checking for
isDataModelField
andisTypeDefField
can be combined to improve readability and reduce duplication.Apply this diff to combine the conditions:
- if (isDataModelField(ref) && !ref.type.array) { - return this.createScopeForContainer(ref.type.reference?.ref, globalScope, allowTypeDefScope); - } - if (isTypeDefField(ref) && !ref.type.array) { - return this.createScopeForContainer(ref.type.reference?.ref, globalScope, allowTypeDefScope); - } + if ((isDataModelField(ref) || isTypeDefField(ref)) && !ref.type.array) { + return this.createScopeForContainer(ref.type.reference?.ref, globalScope, allowTypeDefScope); + }
194-195
: Combine conditions to streamline codeSimilar to previous suggestions, combining the conditions for
isDataModelField
andisTypeDefField
reduces redundancy.Apply this diff:
- if (isDataModelField(ref) || isTypeDefField(ref)) { - return this.createScopeForContainer(ref.type.reference?.ref, globalScope, allowTypeDefScope); - } + if ((isDataModelField(ref) || isTypeDefField(ref))) { + return this.createScopeForContainer(ref.type.reference?.ref, globalScope, allowTypeDefScope); + }Also applies to: 202-203
156-159
: Consider creating a helper function for repeated logicThe logic for checking
isDataModelField
orisTypeDefField
and callingcreateScopeForContainer
is repeated across several methods. Extracting this into a helper function could enhance maintainability.Also applies to: 194-195, 202-203
packages/runtime/src/cross/mutator.ts (4)
46-46
: Avoid unnecessaryreturn await
in async functionUsing
return await
in an async function is redundant because the function will automatically return a promise. Removingawait
improves performance by avoiding an extra microtask.Apply this diff to simplify the return statement:
-export async function applyMutation( +export function applyMutation( queryModel: string, queryOp: string, queryData: any, mutationModel: string, mutationOp: PrismaWriteActionType, mutationArgs: any, modelMeta: ModelMeta, logging: boolean ) { - return await doApplyMutation(queryModel, queryData, mutationModel, mutationOp, mutationArgs, modelMeta, logging); + return doApplyMutation(queryModel, queryData, mutationModel, mutationOp, mutationArgs, modelMeta, logging); }Note: Since
applyMutation
now returns a promise directly, you can remove theasync
keyword.
49-57
: Consider adding documentation fordoApplyMutation
Adding JSDoc comments to the new
doApplyMutation
function would improve code readability and maintainability by explaining its purpose, parameters, and return value.
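For example, a JSDoc sketch along these lines (parameter meanings are inferred from the `applyMutation` call site shown above; treat the wording as a starting point rather than authoritative documentation):

```ts
/**
 * Applies the effect of a mutation to previously fetched query data.
 *
 * @param queryModel    model whose query produced `queryData`
 * @param queryData     the cached query result to patch
 * @param mutationModel model targeted by the mutation
 * @param mutationOp    Prisma write action being applied
 * @param mutationArgs  arguments passed to that write action
 * @param modelMeta     model metadata used to resolve fields and relations
 * @param logging       whether to emit debug logging
 * @returns the updated data when the mutation affects it (inferred; verify against the implementation)
 */
```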
155-159
: Avoid unnecessary cloning of result dataCloning
resultData
inside the loop can be inefficient. Consider cloning the array once before the loop if any updates are expected.Apply this diff to optimize the cloning process:
for (let i = 0; i < resultData.length; i++) { const item = resultData[i]; // ... existing checks ... if (r) { - if (!arrayCloned) { - resultData = [...resultData]; - arrayCloned = true; - } + resultData = resultData.slice(); // Clone the array once before applying updates resultData[i] = r; updated = true; } }
263-270
: ValidatemutateArgs
more robustlyEnsure that
mutateArgs
has the necessary properties before proceeding to avoid runtime errors.Consider combining the checks for better readability:
-if (!mutateArgs?.where || typeof mutateArgs.where !== 'object') { +if (!mutateArgs?.where || !mutateArgs?.data || typeof mutateArgs.where !== 'object' || typeof mutateArgs.data !== 'object') { return undefined; } - -if (!mutateArgs?.data || typeof mutateArgs.data !== 'object') { - return undefined; -}packages/schema/src/language-server/zmodel-linker.ts (1)
521-521
: Consider adding inline comments for clarity inresolveToDeclaredType
.With the added complexity of handling multiple types, including
TypeDefFieldType
, adding brief comments withinresolveToDeclaredType
can improve readability by explaining the logic behind the type checks and assignments.packages/plugins/openapi/src/rpc-generator.ts (3)
3-4
: Ensure All Imported Modules Are UtilizedThe imports now include
PluginError
,PluginOptions
,analyzePolicies
,requireOption
, andresolvePath
. Verify that all these modules are used within the code to prevent unnecessary dependencies.
42-45
: SimplifyomitInputDetails
Validation LogicConsider simplifying the validation by directly checking the type of
omitInputDetails
.Apply this diff to streamline the validation:
this.omitInputDetails = this.getOption<boolean>('omitInputDetails', false); -if (this.omitInputDetails !== undefined && typeof this.omitInputDetails !== 'boolean') { +if (typeof this.omitInputDetails !== 'boolean') { throw new PluginError(name, `Invalid option value for "omitInputDetails", boolean expected`); }
849-849
: Preservedescription
Parameter inomittableRef
In the
omittableRef
method, consider passing thedescription
parameter when returningthis.ref(ANY_OBJECT)
to maintain any provided descriptions for the schema reference.Apply this diff to include the
description
parameter:return this.ref(ANY_OBJECT); +return this.ref(ANY_OBJECT, rooted, description);
packages/schema/tests/schema/validation/attribute-validation.test.ts (1)
Line range hint
23-71
: Refactor repetitive test cases to enhance maintainability.The
field attribute type checking
test contains multiple similar scenarios checking for specific error messages when invalid attribute configurations are provided. Consider refactoring these into a parameterized test or looping over an array of test inputs and expected errors. This will reduce code duplication and improve readability.packages/runtime/src/enhancements/node/delegate.ts (1)
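One way to express that, sketched with Jest's `it.each` (the schema snippets, expected error strings, and the `expectValidationError` helper are placeholders standing in for whatever assertion utility the existing tests already use):

```ts
const invalidCases = [
    { attr: '@length("abc")', error: 'expects an integer' }, // placeholder error text
    { attr: '@startsWith(123)', error: 'expects a string' }, // placeholder error text
];

it.each(invalidCases)('reports a type error for $attr', async ({ attr, error }) => {
    // Reuse the existing assertion helper from these tests in place of this placeholder.
    await expectValidationError(
        `
        model M {
            id String @id
            value String ${attr}
        }
        `,
        error
    );
});
```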
Line range hint
590-633
: Prevent Mutatingargs
During Iteration inensureBaseCreateHierarchy
In the
ensureBaseCreateHierarchy
method, when checkinghasDelegateBase
and callingthis.fkAssignmentToConnect(model, args);
, ensure that mutatingargs
within iteration doesn't lead to unexpected behavior.Consider iterating over a copy of the keys to prevent issues caused by adding or deleting properties during iteration:
- this.fkAssignmentToConnect(model, args); + const argsCopy = { ...args }; + this.fkAssignmentToConnect(model, argsCopy);This approach avoids side effects that could occur if
args
is modified during the execution offkAssignmentToConnect
.packages/plugins/tanstack-query/tests/react-hooks-v5.test.tsx (9)
287-288
: Use explicit typing instead ofany[]
fordata
Defining a specific interface or type for
data
enhances type safety and code readability. Consider creating an interface for the user data structure.
351-352
: Use explicit typing instead ofany[]
fordata
Defining specific types for
data
rather than usingany[]
can improve maintainability and leverage TypeScript's type-checking capabilities.
596-598
: Use explicit typing instead of implicitany
fordata
Assigning an explicit type to
data
promotes better type safety. Define an interface representing the structure ofdata
.
768-769
: Prefer explicit types overany
fordata
Using specific types for
data
enhances code clarity and helps prevent runtime errors. Consider defining an interface for the data structure.
833-834
: Use specific types instead ofany
fordata
Explicitly typing
data
improves type safety and code readability. Define an interface to represent the posts data structure.
950-951
: Define explicit types fordata
instead of usingany
Providing a specific type for
data
ensures type correctness and enhances maintainability. Create an interface to represent the user and posts structure.
1015-1016
: Avoid usingany
type fordata
Replacing
any
with a defined type fordata
leverages TypeScript's strengths and improves code quality.
1174-1175
: Use explicit type definitions fordata
Defining a specific type for
data
enhances type safety and makes the code more self-documenting.
1236-1239
: Provide explicit types for thedata
arrayDefining an interface for the elements within
data
improves type checking and code clarity.packages/schema/src/plugins/zod/transformer.ts (1)
355-374
: Consider adding unit tests forisFieldOptional
method.To ensure the robustness of the
isFieldOptional
logic across various scenarios, it would be beneficial to implement unit tests that cover different field configurations, including default attributes and relation fields.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
⛔ Files ignored due to path filters (20)
package.json
is excluded by!**/*.json
packages/ide/jetbrains/package.json
is excluded by!**/*.json
packages/language/package.json
is excluded by!**/*.json
packages/language/src/generated/ast.ts
is excluded by!**/generated/**
,!**/generated/**
packages/language/src/generated/grammar.ts
is excluded by!**/generated/**
,!**/generated/**
packages/misc/redwood/package.json
is excluded by!**/*.json
packages/plugins/openapi/package.json
is excluded by!**/*.json
packages/plugins/openapi/tests/baseline/rest-3.0.0.baseline.yaml
is excluded by!**/*.yaml
packages/plugins/openapi/tests/baseline/rest-3.1.0.baseline.yaml
is excluded by!**/*.yaml
packages/plugins/openapi/tests/baseline/rpc-3.0.0-omit.baseline.yaml
is excluded by!**/*.yaml
packages/plugins/openapi/tests/baseline/rpc-3.1.0-omit.baseline.yaml
is excluded by!**/*.yaml
packages/plugins/swr/package.json
is excluded by!**/*.json
packages/plugins/tanstack-query/package.json
is excluded by!**/*.json
packages/plugins/trpc/package.json
is excluded by!**/*.json
packages/runtime/package.json
is excluded by!**/*.json
packages/schema/package.json
is excluded by!**/*.json
packages/sdk/package.json
is excluded by!**/*.json
packages/sdk/src/package.json
is excluded by!**/*.json
packages/server/package.json
is excluded by!**/*.json
packages/testtools/package.json
is excluded by!**/*.json
📒 Files selected for processing (61)
packages/ide/jetbrains/CHANGELOG.md
(1 hunks)packages/ide/jetbrains/build.gradle.kts
(1 hunks)packages/language/src/zmodel.langium
(4 hunks)packages/plugins/openapi/src/rest-generator.ts
(3 hunks)packages/plugins/openapi/src/rpc-generator.ts
(19 hunks)packages/plugins/openapi/tests/openapi-rpc.test.ts
(2 hunks)packages/plugins/swr/src/generator.ts
(1 hunks)packages/plugins/tanstack-query/src/generator.ts
(6 hunks)packages/plugins/tanstack-query/tests/portable.test.ts
(1 hunks)packages/plugins/tanstack-query/tests/react-hooks-v5.test.tsx
(7 hunks)packages/plugins/tanstack-query/tests/test-model-meta.ts
(1 hunks)packages/plugins/trpc/src/client-helper/index.ts
(0 hunks)packages/plugins/trpc/src/generator.ts
(0 hunks)packages/plugins/trpc/tests/projects/nuxt-trpc-v10/prisma/schema.prisma
(0 hunks)packages/plugins/trpc/tests/projects/nuxt-trpc-v11/prisma/schema.prisma
(0 hunks)packages/plugins/trpc/tests/projects/t3-trpc-v11/prisma/schema.prisma
(0 hunks)packages/runtime/src/cross/mutator.ts
(9 hunks)packages/runtime/src/enhancements/node/delegate.ts
(2 hunks)packages/runtime/src/enhancements/node/proxy.ts
(1 hunks)packages/runtime/src/enhancements/node/query-utils.ts
(1 hunks)packages/schema/src/cli/cli-util.ts
(3 hunks)packages/schema/src/cli/plugin-runner.ts
(3 hunks)packages/schema/src/language-server/utils.ts
(2 hunks)packages/schema/src/language-server/validator/attribute-application-validator.ts
(1 hunks)packages/schema/src/language-server/validator/function-invocation-validator.ts
(2 hunks)packages/schema/src/language-server/validator/schema-validator.ts
(2 hunks)packages/schema/src/language-server/validator/typedef-validator.ts
(1 hunks)packages/schema/src/language-server/validator/utils.ts
(1 hunks)packages/schema/src/language-server/zmodel-linker.ts
(7 hunks)packages/schema/src/language-server/zmodel-scope.ts
(5 hunks)packages/schema/src/plugins/enhancer/enhance/auth-type-generator.ts
(4 hunks)packages/schema/src/plugins/enhancer/enhance/index.ts
(13 hunks)packages/schema/src/plugins/enhancer/enhance/model-typedef-generator.ts
(1 hunks)packages/schema/src/plugins/enhancer/index.ts
(2 hunks)packages/schema/src/plugins/enhancer/policy/policy-guard-generator.ts
(0 hunks)packages/schema/src/plugins/enhancer/policy/utils.ts
(3 hunks)packages/schema/src/plugins/plugin-utils.ts
(2 hunks)packages/schema/src/plugins/prisma/index.ts
(3 hunks)packages/schema/src/plugins/prisma/schema-generator.ts
(1 hunks)packages/schema/src/plugins/zod/generator.ts
(7 hunks)packages/schema/src/plugins/zod/transformer.ts
(7 hunks)packages/schema/src/plugins/zod/utils/schema-gen.ts
(1 hunks)packages/schema/src/res/stdlib.zmodel
(3 hunks)packages/schema/src/utils/ast-utils.ts
(2 hunks)packages/schema/tests/schema/validation/attribute-validation.test.ts
(1 hunks)packages/sdk/src/code-gen.ts
(2 hunks)packages/sdk/src/model-meta-generator.ts
(3 hunks)packages/sdk/src/types.ts
(2 hunks)packages/sdk/src/utils.ts
(6 hunks)tests/integration/tests/enhancements/json/crud.test.ts
(2 hunks)tests/integration/tests/enhancements/json/typing.test.ts
(1 hunks)tests/integration/tests/enhancements/json/validation.test.ts
(2 hunks)tests/integration/tests/enhancements/with-policy/auth.test.ts
(1 hunks)tests/integration/tests/enhancements/with-policy/client-extensions.test.ts
(9 hunks)tests/regression/tests/issue-1835.test.ts
(1 hunks)tests/regression/tests/issue-1843.test.ts
(1 hunks)tests/regression/tests/issue-1849.test.ts
(1 hunks)tests/regression/tests/issue-1857.test.ts
(1 hunks)tests/regression/tests/issue-1859.test.ts
(1 hunks)tests/regression/tests/issue-756.test.ts
(1 hunks)tests/regression/tests/issue-prisma-extension.test.ts
(1 hunks)
💤 Files with no reviewable changes (6)
- packages/plugins/trpc/src/client-helper/index.ts
- packages/plugins/trpc/src/generator.ts
- packages/plugins/trpc/tests/projects/nuxt-trpc-v10/prisma/schema.prisma
- packages/plugins/trpc/tests/projects/nuxt-trpc-v11/prisma/schema.prisma
- packages/plugins/trpc/tests/projects/t3-trpc-v11/prisma/schema.prisma
- packages/schema/src/plugins/enhancer/policy/policy-guard-generator.ts
🔇 Additional comments (163)
tests/regression/tests/issue-1849.test.ts (1)
7-7
: Verify the enum import path resolution
The relative import path './enum' might need verification to ensure it works consistently across different test environments.
tests/regression/tests/issue-1835.test.ts (2)
23-24
: LGTM: Appropriate test configuration
The PostgreSQL provider with pushDb: false
is a good configuration for regression testing as it prevents unintended side effects on the database.
12-16
: Verify the necessity of @@ignore
directive
The Model
includes the @@ignore
directive, but it's not clear why this model needs to be ignored. This could potentially hide issues if the model is meant to be used elsewhere.
Let's check if this model is referenced elsewhere in the codebase:
✅ Verification successful
@@ignore
directive is necessary for testing issue #1835
The @@ignore
directive is intentionally used here to test a specific bug (issue #1835) where ignoring a model that references an enum causes enhanced client generation to fail. This is a regression test that verifies the fix for this issue, so the @@ignore
directive is an essential part of the test case.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for any references to this model in the codebase
rg -l "model\s+Model\s*{" --type ts
Length of output: 135
Script:
#!/bin/bash
# Search for any references to "Model" in the codebase
rg -l "Model" --type ts
# Also check the specific test file content to understand the context
cat tests/regression/tests/issue-1835.test.ts
Length of output: 9776
Script:
#!/bin/bash
# Let's check the issue #1835 to understand the context
gh issue view 1835 || true
# Also search for any other test files that use @@ignore
rg -A 2 "@@ignore" --type ts tests/
Length of output: 2868
tests/regression/tests/issue-756.test.ts (1)
31-31
: Verify other tests affected by the error message change
The change from "DataModelField" to "MemberAccessTarget" in the error message reflects an architectural change in member access handling. Let's ensure all related tests are updated consistently.
✅ Verification successful
No changes needed in other tests
The error message change from "DataModelField" to "MemberAccessTarget" appears to be consistent with the current codebase. From the search results:
- The test in
issue-1064.test.ts
doesn't contain any error message assertions related to member access. - Other tests in the codebase are already using the new error message format:
validation.test.ts
uses "MemberAccessTarget" in its assertionsissue-925.test.ts
uses "ReferenceTarget" which is another variant of the same error message pattern
The change in issue-756.test.ts
is correct and aligns with the current error message format used throughout the codebase.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Find other tests that might need similar updates
# Look for tests expecting the old "DataModelField" error message
rg -l "DataModelField" tests/
# Look for similar access policy tests
rg -l "auth\(\).*authorId" tests/
Length of output: 148
Script:
#!/bin/bash
# Let's examine the content of issue-1064.test.ts to see if it needs similar updates
cat tests/regression/tests/issue-1064.test.ts
# Also, let's check for any other occurrences of similar error messages in the codebase
rg "Could not resolve reference to" -A 2 tests/
Length of output: 11362
packages/schema/src/language-server/utils.ts (2)
3-3
: LGTM: New imports are properly organized and utilized
The added imports are well-organized alongside related imports and are effectively utilized in the new functionality.
Also applies to: 5-5, 6-6
35-36
: Verify consistent usage across the codebase
Let's ensure this new type guard is consistently used where appropriate, particularly in ZModelLinker and SchemaValidator.
✅ Verification successful
Based on the shell script results, I can now generate the final response:
Consistent usage of isMemberContainer
verified
The new isMemberContainer
type guard is correctly used in the codebase, particularly in the language server components where type resolution and member access need to handle both DataModel and TypeDef cases. Key findings:
- Used appropriately in
zmodel-linker.ts
for member access resolution - Existing code in
sdk/utils.ts
already uses the combined check pattern (isDataModel(d) || isTypeDef(d)
) - No instances found where
isDataModel
checks should be replaced withisMemberContainer
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check usage patterns of isDataModel and isMemberContainer
# to ensure consistent adoption of the new type guard
# Check for potential places where isMemberContainer should be used
# Look for isDataModel usage that might need to be updated
rg "isDataModel\(" -A 3 -B 3
# Look for existing usage of isMemberContainer
rg "isMemberContainer\(" -A 3 -B 3
# Check for TypeDef handling that might benefit from isMemberContainer
ast-grep --pattern 'isTypeDef($_)'
Length of output: 51126
packages/schema/src/language-server/validator/typedef-validator.ts (2)
13-13
: LGTM: Logical placement of attribute validation
The new validation call is well-positioned in the validation sequence, occurring after structural validation (duplicates) but before field validation.
17-19
: Consider adding defensive coding and type-specific validation
The implementation is clean and follows established patterns, but could be enhanced with:
- Null/undefined checks for the attributes array
- Specific validation rules for attributes that are valid in the TypeDef context
Consider this enhancement:
private validateAttributes(typeDef: TypeDef, accept: ValidationAcceptor) {
+ if (!typeDef.attributes?.length) {
+ return;
+ }
typeDef.attributes.forEach((attr) => validateAttributeApplication(attr, accept));
}
Let's check what attributes are currently being used with TypeDef:
tests/regression/tests/issue-1857.test.ts (2)
15-15
: Review permissive access control
The @@allow('all', true)
directive grants unrestricted access to the Post model. While this might be acceptable for testing, ensure this doesn't reflect production access patterns.
27-27
: Verify enhance import path
The relative import path .zenstack/enhance
might be sensitive to the execution context. Consider using the package import instead.
✅ Verification successful
Based on the search results, I can generate the final response as we have sufficient information about the enhance import patterns across the codebase.
Import path is consistent with test context
The relative import path .zenstack/enhance
is consistently used across test files, particularly in regression and integration tests. The package import @zenstackhq/runtime
is specifically used in framework-specific test projects. Since this file is a regression test, the current import path follows the established pattern.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Check how enhance is typically imported in other files
rg "import.*enhance.*from" --type ts
Length of output: 3091
packages/ide/jetbrains/CHANGELOG.md (2)
7-7
: LGTM! Clear and descriptive changelog entry
The changelog entry clearly describes the bug fix related to validation errors in strongly typed JSON fields when using multi-file schema setup. This aligns well with the changes described in the PR summary.
5-8
:
Update version number for the release
Since this PR is merging v2.9.0 to main, the "Unreleased" section should be updated to "2.9.0" to reflect the version being released.
Apply this diff:
-## [Unreleased]
+## [2.9.0]
Likely invalid or redundant comment.
packages/schema/src/plugins/enhancer/index.ts (1)
47-47
: LGTM! Verify type definitions are properly referenced
The addition of prismaClientDtsPath
to the return object is correct and maintains type safety.
Let's verify the type definitions are properly referenced:
✅ Verification successful
Based on the results, I can now generate the final response:
Type definitions and usage of prismaClientDtsPath
are properly referenced
The verification shows that:
- The type is properly defined as an optional string in the interfaces in
packages/sdk/src/types.ts
- It's consistently used across the codebase:
- Generated in
plugins/prisma/index.ts
- Passed through in
plugins/enhancer/index.ts
- Correctly consumed in
plugins/tanstack-query/src/generator.ts
- Generated in
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check if prismaClientDtsPath is properly typed and used consistently
# Check type definitions
ast-grep --pattern 'interface $interface {
$$$
prismaClientDtsPath?: $_
$$$
}'
# Check usage patterns
rg -A 2 'prismaClientDtsPath'
Length of output: 3042
packages/plugins/tanstack-query/tests/test-model-meta.ts (1)
48-56
: LGTM! Improved relationship definition between Post and User models
The changes enhance the relationship definition by:
- Adding explicit foreign key mapping in the owner field
- Correctly typing ownerId as 'String' to match the User's id type
- Maintaining clear separation between relation and foreign key fields
This makes the relationship structure more explicit and type-safe.
packages/sdk/src/code-gen.ts (2)
1-1
: LGTM: Import change is consistent with new functionality
The addition of SourceFile
import aligns with its usage in the new saveSourceFile
function.
18-18
: Verify the implications of disabling implicit any checks
Setting noImplicitAny: false
reduces TypeScript's type safety. While this might be necessary for code generation, please ensure this won't lead to type-related issues in the generated code.
✅ Verification successful
Let me gather more context about the code generation use case to better understand the necessity of this compiler option.
Based on the gathered context, I can now provide the final response:
Setting noImplicitAny: false
is justified in this code generation context
The compiler options are specifically scoped to code generation in createProject()
function, not affecting the main codebase. The configuration is used alongside other strict type checking options (strict: true
), indicating a balanced approach to type safety. The limited usage of any
types in test files suggests good type discipline across the codebase.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check for any existing usages of noImplicitAny in the codebase and related discussions
# Search for noImplicitAny configurations
echo "Searching for existing noImplicitAny configurations..."
rg --type ts "noImplicitAny"
# Search for potential implicit any usage in test files
echo "Checking test files for implicit any usage..."
fd -e test.ts -e spec.ts --exec ast-grep --pattern 'function $_($_: any) { $$$ }'
Length of output: 732
Script:
#!/bin/bash
# Check the code generation context and its type handling
echo "Checking code generation implementation..."
rg -A 5 -B 5 "compilerOptions" packages/sdk/src/code-gen.ts
# Look for any type-related error handling or validation in code generation
echo "Checking type-related error handling..."
ast-grep --pattern 'catch ($_) { $$$ }' packages/sdk/src/code-gen.ts
# Check if there are any type assertions or type handling in generated code
echo "Checking type handling patterns..."
rg "as \w+" packages/sdk/src/code-gen.ts
Length of output: 831
packages/sdk/src/types.ts (3)
34-39
: LGTM! Well-documented private property addition.
The new prismaClientDtsPath
property is properly typed, optional (maintaining backward compatibility), and well-documented with the @private
annotation.
83-88
: LGTM! Consistent implementation with PluginOptions.
The property definition matches its counterpart in PluginOptions
, maintaining consistency in typing, optionality, and documentation.
34-39
: Verify type exports and usage across the codebase.
Since this adds new properties that are used across multiple files, let's verify the proper export and usage of these types.
Also applies to: 83-88
✅ Verification successful
Based on the results, I can now generate the final response:
Property prismaClientDtsPath
is properly integrated across the codebase
The property is:
- Correctly defined in
packages/sdk/src/types.ts
- Properly imported and used across multiple packages including schema, plugins (tanstack-query, openapi, trpc)
- Consistently handled in plugin runners and generators
- Used for critical functionality like TypeScript declaration file processing in tanstack-query generator
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify the usage of prismaClientDtsPath across the codebase
# Check if the types are properly exported and imported
rg -l "import.*PluginOptions|PluginResult.*from.*@zenstackhq/sdk"
# Check for actual usage of the new property
rg "prismaClientDtsPath" --type ts
Length of output: 1796
packages/schema/src/language-server/validator/schema-validator.ts (1)
39-40
: Verify handling of @@auth models in imported declarations.
The change to use getDataModelAndTypeDefs
with true
parameter includes imported declarations. Let's verify this doesn't affect the validation of multiple @@auth models across imported files.
✅ Verification successful
Multiple @@auth model validation is properly handled with imported declarations
The code inspection shows that the validation of multiple @@auth models is consistently enforced across both imported and local declarations:
- In schema-validator.ts, the validation explicitly checks all declarations (including imports) and raises an error if multiple @@auth models are found
- The same validation is also enforced in cli-util.ts for command-line operations
- The policy enhancer in policy/utils.ts expects exactly one auth model across all declarations
The change to use getDataModelAndTypeDefs
with true
parameter maintains the correct validation behavior by ensuring that multiple @@auth models are not allowed, regardless of whether they are in imported files or local declarations.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check for potential multiple @@auth models across imported files
# Expected: Should find any test files that verify this scenario
# Look for test files related to auth model validation
rg -l "@@auth.*model" --type ts test/
# Look for specific test cases handling imports
ast-grep --pattern 'describe("auth model", $_) {
$$$
test($_, $_) {
$$$
import
$$$
}
}'
Length of output: 193
Script:
#!/bin/bash
# Let's try a different approach to verify auth model validation:
# 1. Find test files in the correct location
fd "test" --type f --extension ts
# 2. Look for auth model validation tests with more flexible pattern
rg -l "@@auth" --type ts
# 3. Check the implementation of auth model validation
ast-grep --pattern 'function validateAuthModel($_) {
$$$
}'
# 4. Look for import handling in schema validation
rg "getDataModelAndTypeDefs.*true" -A 5 --type ts
Length of output: 13555
packages/schema/src/plugins/enhancer/enhance/model-typedef-generator.ts (1)
20-24
: LGTM! Type definition enhancement aligns with Prisma's behavior.
The change correctly makes optional fields nullable by appending | null
to their type definitions, which improves type safety by accurately reflecting Prisma's handling of optional fields.
Let's verify this aligns with Prisma's type generation:
✅ Verification successful
From analyzing the code, particularly the Prisma builder implementation and test files, I can now make a definitive assessment about the review comment.
Optional fields are correctly made nullable to match Prisma's behavior
The implementation in model-typedef-generator.ts
correctly handles optional fields by making them nullable with | null
. This is verified by:
- The ModelFieldType class in prisma-builder.ts shows that Prisma's own implementation handles optional fields with a ? modifier
- The test file prisma-generator.test.ts contains examples of optional fields (e.g., content String?), which demonstrates this is standard Prisma behavior
- The type generation logic correctly maps these optional fields to TypeScript types with both the optional modifier ? and the nullable union | null
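To make the mapping described above concrete, here is a small, hand-written TypeScript illustration (the type and field names are invented for the example, not taken from the generated code):
// Illustration: a field declared as `content String?` in the schema is emitted
// with both the optional modifier and an explicit null union, matching how
// Prisma itself types optional scalar fields.
type PostTypeDef = {
    id: string;
    content?: string | null; // may be absent, or present and explicitly null
};

// Both shapes type-check:
const withContent: PostTypeDef = { id: '1', content: 'hello' };
const withNull: PostTypeDef = { id: '2', content: null };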
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify Prisma's type generation for optional fields
# Expected: Find examples of Prisma-generated types where optional fields are marked as nullable
# Search for Prisma schema files with optional fields
rg -U "(?s)model\s+\w+\s*{[^}]*\?\s*:[^}]*}" --type prisma
# Search for corresponding generated type definitions
rg -U "(?s)export\s+type\s+\w+\s*=\s*{[^}]*\?\s*:[^}]*\|\s*null[^}]*}" --type typescript
Length of output: 465
Script:
#!/bin/bash
# Let's try a simpler approach to find Prisma schema files and type definitions
# First, find Prisma schema files with optional fields
rg "\\?" --type prisma -A 2
# Then, let's look for generated type definitions that include nullable types
rg "type.*\\{" --type typescript -A 5 | rg -B 5 "\\| null"
# Additionally, let's check if there are any Prisma-related test files or examples
fd "prisma" --type f --exec rg -l "\\?"
Length of output: 678
Script:
#!/bin/bash
# Let's examine the Prisma-related files we found for type generation patterns
# Check the Prisma generator test file for type generation examples
cat packages/schema/tests/generator/prisma-generator.test.ts
# Look at the Prisma builder implementation
cat packages/schema/src/plugins/prisma/prisma-builder.ts
# Check Prisma types
cat packages/plugins/prisma-types.ts
# Look for nullable type patterns in these files
rg "\\?" -A 1 packages/schema/tests/generator/prisma-generator.test.ts packages/schema/src/plugins/prisma/prisma-builder.ts packages/plugins/prisma-types.ts
Length of output: 33461
packages/ide/jetbrains/build.gradle.kts (1)
Line range hint 20-22
: Consider narrowing the IDE version compatibility range
The current configuration targets IDE version 2023.3.2 but allows compatibility up to build 251.*, which is quite forward-looking. While this provides flexibility, it might expose the plugin to compatibility issues with future IDE versions.
Also applies to: 53-54
✅ Verification successful
Let me gather more information about the build configuration to better understand the version compatibility settings.
Based on the information gathered, I can now provide a final response regarding the IDE version compatibility range.
The current IDE version compatibility range is appropriate
The configuration in plugin.xml
and build.gradle.kts
follows JetBrains' recommended practices:
- since-build="233.2" matches the target IDE version 2023.3.2
- untilBuild="251.*" provides forward compatibility while staying within a reasonable range
This setup is common in IntelliJ plugins and aligns with JetBrains' guidelines for version compatibility. The range allows for:
- Stable support for the current target version (2023.3.2)
- Compatibility with future minor updates
- A reasonable forward-compatibility window that doesn't overextend
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check current IDE plugin compatibility guidelines
# Check marketplace for plugins with similar version ranges
gh api graphql -f query='
{
search(query: "topic:intellij-plugin", type: REPOSITORY, first: 5) {
nodes {
... on Repository {
name
object(expression: "master:build.gradle.kts") {
... on Blob {
text
}
}
}
}
}
}' | jq -r '.data.search.nodes[].object.text' | rg "untilBuild|sinceBuild" -A 1 -B 1
Length of output: 729
Script:
#!/bin/bash
# Check the actual build configuration files for version compatibility settings
fd "build.gradle.kts|plugin.xml" --type f | xargs cat
# Also check for any documentation about version compatibility
fd "README|CONTRIBUTING" --extension md --type f | xargs cat
Length of output: 37606
packages/schema/src/language-server/validator/utils.ts (2)
9-9
: LGTM! Import change looks good.
The addition of isAuthInvocation
import is correctly placed and necessary for the isAuthOrAuthMemberAccess
function.
Line range hint 1-100
: Verify the impact of removed functionality.
The removal of assignableToAttributeParam
function and its related imports appears intentional as part of the validation mechanism updates. Let's verify that this removal doesn't have unintended consequences.
tests/regression/tests/issue-1859.test.ts (2)
7-16
: LGTM: Well-structured test setup
The schema definition with access control rules and test data setup provides good coverage for testing the enhancement and extension scenarios.
Also applies to: 18-19
81-82
: Verify similar access control bypass patterns
The comment suggests this bypass is known behavior. Let's verify if this pattern exists elsewhere in the codebase.
tests/regression/tests/issue-prisma-extension.test.ts (1)
1-3
: LGTM: Clean test suite setup
The import and test suite structure are appropriately defined.
tests/regression/tests/issue-1843.test.ts (1)
14-14
: Review permissive access control settings
The models use @@allow('all', true)
which grants unrestricted access to all operations. This might be intentional for testing but could be dangerous if copied to production code.
Consider:
- Adding proper access control rules based on user roles
- Using more granular permissions (create, read, update, delete) instead of 'all'
- Adding conditions to restrict access based on record ownership
Also applies to: 34-34, 45-45, 51-51
packages/schema/src/plugins/enhancer/enhance/auth-type-generator.ts (4)
8-8
: LGTM: TypeDef import added correctly
The addition of the TypeDef
import aligns with the function signature changes.
27-27
: LGTM: Type-safe property access
The change correctly uses the name
property which exists on both DataModel
and TypeDef
types.
92-94
: Verify getIdFields utility compatibility
Ensure that the getIdFields
utility function supports both DataModel
and TypeDef
types.
✅ Verification successful
Based on the shell script results, I can generate the final response as we have clear evidence about the getIdFields
utility implementation.
getIdFields utility is compatible with both types
The utility function explicitly accepts and handles both DataModel
and TypeDef
types:
- Function signature:
getIdFields(decl: DataModel | TypeDef)
- Type-aware field access:
const fields = isDataModel(decl) ? getModelFieldsWithBases(decl) : decl.fields
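For reference, the dispatch shown in the verification output boils down to something like the following hedged sketch; the import paths and the `@id` filtering shown here are assumptions based on the excerpts above, not a copy of the actual implementation:
// Sketch only: DataModel fields may come from base models, while TypeDef
// fields are read directly off the declaration.
import { DataModel, TypeDef, isDataModel } from '@zenstackhq/sdk/ast';
import { getModelFieldsWithBases, hasAttribute } from '@zenstackhq/sdk';

function getIdFieldsSketch(decl: DataModel | TypeDef) {
    const fields = isDataModel(decl) ? getModelFieldsWithBases(decl) : decl.fields;
    return fields.filter((f) => hasAttribute(f, '@id'));
}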
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check getIdFields implementation for type support
# Look for getIdFields implementation
ast-grep --pattern 'function getIdFields($_) {
$$$
}'
# Check for TypeDef handling in the implementation
rg -A 10 "getIdFields.*TypeDef"
Length of output: 2573
18-18
: Verify documentation for breaking change
The function signature change from authModel: DataModel
to authDecl: DataModel | TypeDef
is a breaking change that could affect existing consumers.
tests/integration/tests/enhancements/json/validation.test.ts (1)
1-1
: LGTM!
The import statement is clear and properly imports the necessary test utilities.
packages/schema/src/plugins/plugin-utils.ts (1)
37-38
: LGTM! Verify version synchronization.
The improvements to package.json generation look good:
- More descriptive package name
- Dynamic version management with fallback
Let's verify version synchronization across the codebase:
✅ Verification successful
Version synchronization is properly implemented and consistent
The verification shows that:
- The getVersion() function is properly implemented in both packages/schema/src/utils/version-utils.ts and packages/runtime/src/version.ts
- The version is consistently used across the codebase for:
  - Generated package.json (plugin-utils.ts)
  - CLI version display
  - Telemetry
  - Error reporting
- All main packages are synchronized at version 2.9.0, which aligns with the PR being a version merge to v2.9.0
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check version consistency across the codebase
# Search for other version references
echo "Searching for version references:"
rg -g '!{dist,node_modules}/**' -g '!package-lock.json' '"version":\s*"[^"]*"' --type json
# Check getVersion implementation
echo -e "\nChecking getVersion implementation:"
rg -l "getVersion" --type typescript
Length of output: 1484
Script:
#!/bin/bash
# Let's try a different approach to find getVersion implementation and its usage
# Find TypeScript files containing getVersion
echo "Files containing getVersion:"
rg -l "getVersion" --type ts
# Show the implementation and usage context
echo -e "\ngetVersion implementation and usage:"
rg -A 5 "getVersion" --type ts
Length of output: 10591
packages/plugins/tanstack-query/tests/portable.test.ts (1)
1-6
: LGTM: Test setup and imports are well-structured
The necessary imports and type references are properly configured for the test suite.
packages/schema/src/plugins/prisma/index.ts (3)
31-39
: LGTM: Well-structured variable declarations
The new variables are clearly named and properly documented with appropriate default values.
79-82
: LGTM: Proper tracking of custom client output location
The code correctly updates the client output directory tracking when a custom path is specified.
110-110
: Verify TypeScript return type definition
The return statement has been updated to include prismaClientDtsPath
. Ensure that the TypeScript return type for the plugin function is updated accordingly.
✅ Verification successful
Let me gather more information about the PluginResult type to verify if it includes prismaClientDtsPath.
TypeScript return type is correctly defined
The return type is properly defined in the PluginResult
interface which includes all the fields being returned:
warnings: string[]
prismaClientPath?: string
prismaClientDtsPath?: string
dmmf?: DMMF.Document
shortNameMap?: Map<string, string>
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for the plugin function type definition
ast-grep --pattern 'type PluginFunction = $$$'
Length of output: 448
Script:
#!/bin/bash
# Search for PluginResult type definition
ast-grep --pattern 'type PluginResult = $$$'
Length of output: 1548
packages/language/src/zmodel.langium (3)
97-102
: LGTM! Clean implementation of member access.
The introduction of MemberAccessTarget and its usage in MemberAccessExpr provides a well-structured way to handle member access for both DataModel and TypeDef fields.
108-110
: LGTM! Helpful documentation reference.
Good addition of the Mozilla reference for binary operator precedence rules. This helps maintainers understand the grammar's operator precedence design.
69-69
: LGTM! Verify TypeDefField references.
The addition of TypeDefField to ReferenceTarget is consistent with the enhanced type system.
✅ Verification successful
TypeDefField reference integration is properly implemented
The verification shows that TypeDefField is well-integrated into the codebase:
- It's properly validated in
typedef-validator.ts
- It's correctly handled in the linker's type resolution system
- It follows the same patterns as other reference targets (DataModelField, FunctionParam)
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify TypeDefField references in the codebase
# Look for any existing references to TypeDefField to ensure they're valid
rg -l "TypeDefField" --type ts
Length of output: 783
Script:
#!/bin/bash
# Let's examine the actual usage of TypeDefField in reference contexts
# Looking for patterns where TypeDefField is used in reference expressions
# First, let's check the grammar definition
ast-grep --pattern 'interface TypeDefField {
$$$
}'
# Then check for references in validation and linking logic
rg "TypeDefField" -A 5 -B 5 packages/schema/src/language-server/zmodel-linker.ts packages/schema/src/language-server/validator/typedef-validator.ts
Length of output: 4694
packages/schema/src/language-server/validator/function-invocation-validator.ts (2)
Line range hint 98-103
: LGTM! The validation logic remains robust
The changes maintain the integrity of the validation flow while potentially supporting a broader range of field types. The error handling and type checking remain comprehensive.
18-18
: Verify SDK breaking changes
The import change from getDataModelFieldReference
to getFieldReference
suggests a breaking change in the SDK. Ensure that the SDK version is compatible and that all dependent code is updated accordingly.
✅ Verification successful
Import change is part of internal SDK refactoring
The change from getDataModelFieldReference
to getFieldReference
is a valid refactoring within the SDK itself. The new function is properly defined in packages/sdk/src/utils.ts
and is being used correctly in the validator. No remaining usages of the old function name exist in the codebase, indicating a complete migration.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for any remaining usages of the old function name
rg "getDataModelFieldReference" --type ts
# Check package.json for SDK version
cat package.json | jq '.dependencies["@zenstackhq/sdk"]'
Length of output: 105
Script:
#!/bin/bash
# Check if there are any package.json files in the repository
fd package.json
# Look for SDK dependency in all package.json files
fd package.json --exec cat {} \; | jq '.dependencies["@zenstackhq/sdk"]'
# Search for usages of getFieldReference to verify the migration
rg "getFieldReference" --type ts -A 2
Length of output: 2611
packages/schema/src/utils/ast-utils.ts (2)
12-12
: LGTM: Import changes align with TypeDef integration
The new imports support the enhanced type definition handling capabilities mentioned in the PR objectives.
Also applies to: 15-15, 17-17
Line range hint 273-293
: Verify handling of TypeDefs in transitive imports
While the function has been updated to support TypeDefs, there might be an inconsistency in how TypeDefs are handled in transitive imports. The getAllDataModelsIncludingImports
call at line 286 only returns DataModels, potentially missing TypeDefs from imported files.
Let's verify if this is a potential issue:
Consider updating getAllDataModelsIncludingImports
to include TypeDefs or create a new function getAllDataModelsAndTypeDefsIncludingImports
to maintain consistency with the new TypeDef support.
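A minimal sketch of what such a helper could look like follows; the helper name comes from the suggestion above, while the resolveTransitiveImports utility and the exact import paths are assumptions, so treat this as an outline rather than the actual API:
// Hypothetical helper: same shape as getAllDataModelsIncludingImports, but
// also keeping TypeDef declarations from transitively imported documents.
import { LangiumDocuments } from 'langium';
import { DataModel, Model, TypeDef, isDataModel, isTypeDef } from '@zenstackhq/sdk/ast';
import { resolveTransitiveImports } from './ast-utils'; // assumed local utility

export function getAllDataModelsAndTypeDefsIncludingImports(
    documents: LangiumDocuments,
    model: Model
): (DataModel | TypeDef)[] {
    return [model, ...resolveTransitiveImports(documents, model)]
        .flatMap((m) => m.declarations)
        .filter((d): d is DataModel | TypeDef => isDataModel(d) || isTypeDef(d));
}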
tests/integration/tests/enhancements/with-policy/client-extensions.test.ts (8)
47-49
: LGTM: Comprehensive test coverage for getAll method
The assertions correctly verify the behavior of the custom getAll
method across different enhancement scenarios while respecting the access policy.
83-85
: LGTM: Consistent assertions for model-specific extension
The test properly verifies the behavior of model-specific extensions while maintaining consistency with the all-model test case.
114-118
: LGTM: Simplified test structure while maintaining coverage
The test has been improved by removing unnecessary intermediate variables while still thoroughly testing both enhancement scenarios.
153-154
: LGTM: Proper verification of model-specific query override
The assertions correctly verify the query override behavior while testing both enhancement scenarios.
189-190
: LGTM: Consistent verification of all-model query override
The test maintains consistency with the model-specific override test while verifying the all-model behavior.
225-226
: LGTM: Proper verification of operation-level overrides
The test correctly verifies the behavior of operation-level query overrides across both enhancement scenarios.
295-297
: LGTM: Robust result mutation verification
The test uses appropriate matchers and verifies result mutations across both enhancement scenarios.
331-333
: LGTM: Proper verification of custom computed fields
The test correctly verifies the computation of custom fields while testing both enhancement scenarios.
packages/runtime/src/enhancements/node/proxy.ts (1)
257-274
: LGTM! Clean implementation of $extends
support.
The implementation correctly handles the Prisma $extends
API by:
- Properly wrapping the extended client while preserving the enhancer chain
- Avoiding double wrapping through the
PRISMA_PROXY_ENHANCER
check - Following consistent naming patterns with
$transaction
handling
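As a rough sketch of the wrapping pattern described above (generic proxy terms; `enhance` stands in for the actual enhancer factory, and the real proxy.ts implementation differs in detail):
// Intercept `$extends` so the extended client is re-wrapped by the same
// enhancer, while a marker property prevents wrapping an already enhanced
// client a second time.
const PRISMA_PROXY_ENHANCER = '$__zenstack_enhancer';

function wrapExtends(client: any, enhance: (c: any) => any) {
    return new Proxy(client, {
        get(target, prop, receiver) {
            if (prop === '$extends') {
                return (...args: any[]) => {
                    const extended = target.$extends(...args);
                    // avoid double wrapping if the result is already enhanced
                    return extended[PRISMA_PROXY_ENHANCER] ? extended : enhance(extended);
                };
            }
            return Reflect.get(target, prop, receiver);
        },
    });
}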
packages/plugins/openapi/tests/openapi-rpc.test.ts (4)
19-24
: LGTM: Well-structured test parameter addition
The addition of the omitInputDetails
parameter testing is implemented correctly, with proper configuration in the plugin options.
94-101
: LGTM: Clear test setup and logging
The test setup properly handles temporary file creation and provides clear logging of the configuration being tested.
113-133
: LGTM: Comprehensive API validation
The test thoroughly validates all important aspects of the generated OpenAPI specification, including:
- Tag presence and descriptions
- Path configurations and descriptions
- Deprecation settings
- Expected path exclusions
103-111
: Verify existence of baseline files
The test references baseline files with naming pattern rpc-${specVersion}${omitInputDetails ? '-omit' : ''}.baseline.yaml
. Ensure all required baseline files exist.
✅ Verification successful
All baseline files are present and correctly named
All required baseline files exist at their expected locations:
- packages/plugins/openapi/tests/baseline/rpc-3.0.0.baseline.yaml
- packages/plugins/openapi/tests/baseline/rpc-3.0.0-omit.baseline.yaml
- packages/plugins/openapi/tests/baseline/rpc-3.1.0.baseline.yaml
- packages/plugins/openapi/tests/baseline/rpc-3.1.0-omit.baseline.yaml
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check for existence of all required baseline files
# List expected baseline files
for spec in "3.0.0" "3.1.0"; do
for omit in "" "-omit"; do
baseline="packages/plugins/openapi/tests/baseline/rpc-${spec}${omit}.baseline.yaml"
echo "Checking for $baseline"
if [ ! -f "$baseline" ]; then
echo "Missing baseline file: $baseline"
fi
done
done
Length of output: 1475
packages/schema/src/cli/cli-util.ts (3)
108-109
: LGTM: Improved document cleanup implementation
The new implementation properly cleans up both document references and index entries, which is more thorough than the previous approach using DocumentBuilder.update
.
111-111
: LGTM: Proper validation sequencing
The validation is correctly positioned after import merging but before base model merging, ensuring proper validation of the complete set of merged declarations.
135-138
: Verify impact of expanded @@auth validation scope
The validation now checks for @@auth
declarations in both data models and type definitions, whereas it previously only checked data models. This expanded scope could potentially flag previously undetected multiple @@auth
declarations.
Let's verify if there are any existing type definitions with @@auth
declarations that might be affected:
✅ Verification successful
The previous ast-grep pattern might not have matched due to syntax specifics. Let's gather more context about where and how @@auth is used in the codebase to verify the impact of this change.
Based on the search results, I can now provide a definitive response:
No impact from expanded @@auth validation scope
The search results show that @@auth
declarations are only used in data models, not in type definitions. While the attribute is technically supported on type definitions (as shown by @@@supportTypeDef
in stdlib.zmodel), there are no existing type definitions using it in the codebase. The change from getDataModels
to getDataModelAndTypeDefs
won't affect existing code.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for type definitions with @@auth declarations
ast-grep --pattern 'type $_ {
$$$
@@auth
$$$
}'
Length of output: 55
Script:
#!/bin/bash
# Let's try a more comprehensive search for @@auth declarations
# First, let's find all files that contain @@auth
rg -l "@@auth"
# Then, let's see the context of these declarations
rg -B2 -A2 "@@auth"
# Also search for type definitions specifically
rg -B2 -A2 "type\s+\w+\s*\{"
Length of output: 26268
packages/plugins/swr/src/generator.ts (1)
262-262
: LGTM: Metadata export addition
The addition of metadata export aligns with the PR's focus on enhancing validation mechanisms. This change makes model metadata accessible to consumers of the generated code.
Let's verify that the metadata file is properly generated:
✅ Verification successful
Based on the verification results, I can now provide a final response:
Metadata export is properly integrated across plugin packages
The verification shows that:
- Both SWR and Tanstack Query plugins consistently handle the metadata:
  - Both generate the metadata file using generateModelMeta
  - Both import and use the metadata in their generated code
  - Both export the metadata in their index files
- The metadata export addition in SWR plugin aligns with the existing pattern in Tanstack Query plugin
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify the metadata file generation and its exports
# Check if the metadata file is generated
fd -g "__model_meta.ts" -x cat {} \;
# Check if the metadata is imported correctly in other files
rg -l "import.*metadata.*from.*__model_meta"
Length of output: 173
Script:
#!/bin/bash
# Let's check the content of both generator files to understand the metadata usage
echo "=== SWR Generator ==="
rg -A 5 -B 5 "__model_meta" packages/plugins/swr/src/generator.ts
echo -e "\n=== Tanstack Query Generator ==="
rg -A 5 -B 5 "__model_meta" packages/plugins/tanstack-query/src/generator.ts
# Let's also check how the metadata is generated
ast-grep --pattern 'function generateMetadata($$$) {
$$$
}'
Length of output: 2851
Script:
#!/bin/bash
# Let's check the generateModelMeta function implementation
ast-grep --pattern 'async function generateModelMeta($$$) {
$$$
}'
# Also check if there are any tests validating this functionality
rg -l "generateModelMeta|__model_meta" -g "*.test.ts" -g "*.spec.ts"
Length of output: 141
packages/schema/src/cli/plugin-runner.ts (4)
139-140
: LGTM: Variable declaration follows best practices
The declaration of prismaClientDtsPath
is well-typed and appropriately placed alongside related path variables.
185-185
: LGTM: Type casting is appropriate
The type cast to PluginOptions
is necessary and correctly implemented to ensure type safety with the extended options object.
170-171
: Verify type definitions for plugin result
The code assumes the plugin result includes prismaClientDtsPath
. Please ensure the type definitions are updated accordingly.
✅ Verification successful
Type definition for prismaClientDtsPath
is properly defined
The PluginResult
type in packages/sdk/src/types.ts
already includes the prismaClientDtsPath
property as an optional string field with proper documentation.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check if PluginResult type includes prismaClientDtsPath
# Expected: Find type definition with prismaClientDtsPath property
# Search for PluginResult type definition
ast-grep --pattern 'type PluginResult = {
$$$
prismaClientDtsPath?: $_
$$$
}'
# Backup search using ripgrep
rg -A 10 'type\s+PluginResult'
Length of output: 4051
179-179
: Verify user plugin compatibility with new option
The new prismaClientDtsPath
is being passed to user plugins. Ensure that existing plugins can handle this additional option gracefully.
✅ Verification successful
Based on the shell script results, I can now generate the final response:
The prismaClientDtsPath
option is safely handled across the codebase
The verification shows that:
- The option is properly typed in the SDK's type definitions (
packages/sdk/src/types.ts
) - Core plugins like
prisma
andenhancer
handle this path correctly - The
tanstack-query
plugin explicitly checks for this option and handles missing values appropriately with clear error messages
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check if any plugins explicitly use prismaClientDtsPath
# Expected: Find usage patterns or documentation of the new option
# Search for direct usage in plugin files
rg -g '!node_modules' -g '!dist' -A 5 'prismaClientDtsPath'
# Search for plugin option type definitions
ast-grep --pattern 'interface $_Options {
$$$
prismaClientDtsPath?: $_
$$$
}'
Length of output: 5820
packages/sdk/src/model-meta-generator.ts (2)
29-29
: LGTM! Function rename improves clarity.
The rename from getAuthModel
to getAuthDecl
better reflects the function's purpose of handling both data models and type definitions.
103-103
: LGTM! Parameter update aligns with function signature change.
The addition of typeDefs
parameter ensures that type definitions are properly handled in auth model generation.
packages/schema/src/plugins/enhancer/policy/utils.ts (2)
9-10
: LGTM! Import changes align with the function renaming.
The new imports reflect the architectural changes mentioned in the summary, where getAuthModel
was renamed to getAuthDecl
and getDataModelAndTypeDefs
was introduced.
Line range hint 523-533
: LGTM! Auth model retrieval logic is robust.
The updated implementation maintains proper error handling while adopting the new model retrieval approach. The code appropriately validates the auth model and its ID fields.
packages/schema/src/res/stdlib.zmodel (3)
642-642
: Validation Enhancement: Complex Validation Rules for Type Definitions
The addition of @@@supportTypeDef
to the @@validate
attribute enables type definitions to have complex validation rules. This enhancement promotes code reuse by allowing validation logic to be defined once in a type and reused across multiple models.
Let's verify potential validation patterns:
#!/bin/bash
# Search for type definitions with validation rules
ast-grep --pattern 'type $name {
$$$
@@validate($_, $_)
$$$
}'
212-212
: Architectural Enhancement: ID Support for Type Definitions
The addition of @@@supportTypeDef
to the @id
attribute enables type definitions to have ID fields, which is a significant architectural enhancement. This allows for better modeling of reusable ID-containing types.
Let's verify the impact on existing type definitions:
✅ Verification successful
The addition of @@@supportTypeDef
to @id
attribute is a valid enhancement
Based on the search results, we can see that:
- Type definitions are already being used extensively in test files with various attributes
- There are existing test cases showing @id being used within type definitions (e.g., type User { id Int @id })
@id
is only used with scalar or enum types, which is consistent across both models and type definitions
The enhancement to officially support @id
in type definitions through @@@supportTypeDef
aligns with existing usage patterns and validation rules in the codebase.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for type definitions that might benefit from this enhancement
ast-grep --pattern 'type $name {
$$$
}'
Length of output: 43
Script:
#!/bin/bash
# Let's try a different approach to find type definitions using ripgrep
rg "type\s+\w+\s*\{" -A 5
# Also search for any existing usage of @id with type definitions
rg "@id.*type" -A 5
Length of output: 24952
539-539
: Security Enhancement: Auth Rules for Type Definitions
The addition of @@@supportTypeDef
to the @@auth
attribute enables type definitions to have authentication rules. This is a powerful security enhancement that allows for consistent auth rules across multiple models using shared types.
Let's verify potential usage patterns:
✅ Verification successful
Based on the code analysis, I can now generate the final response:
Verified: Auth Rules Support for Type Definitions
The addition of @@@supportTypeDef
to the @@auth
attribute is correctly implemented and validated. The attribute application validator explicitly checks for this support tag when validating attributes on type definitions:
if (isTypeDefField(targetDecl) && !hasAttribute(decl, '@@@supportTypeDef')) {
    accept('error', `attribute "${decl.name}" cannot be used on type declaration fields`);
}
if (isTypeDef(targetDecl) && !hasAttribute(decl, '@@@supportTypeDef')) {
    accept('error', `attribute "${decl.name}" cannot be used on type declarations`);
}
This ensures that only attributes explicitly marked with @@@supportTypeDef
can be used on type definitions, making the auth rules support a well-validated feature.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for type definitions that might benefit from auth rules
ast-grep --pattern 'type $name {
$$$
@@auth()
$$$
}'
Length of output: 164
Script:
#!/bin/bash
# Search for type definitions with auth rules using ripgrep
rg "type\s+\w+\s*{[\s\S]*?@@auth" -A 5
# Also search for any usage of @@@supportTypeDef to understand the context
rg "@@@supportTypeDef"
# Look for type definitions in general to understand the pattern
rg "type\s+\w+\s*{"
Length of output: 3432
Script:
#!/bin/bash
# Let's try a simpler pattern to find type definitions
rg "^type\s+" -A 5
# Look for files that might contain example type definitions
fd -e zmodel -e prisma -e schema
# Check the validator implementation to understand the type definition support
cat packages/schema/src/language-server/validator/attribute-application-validator.ts
Length of output: 40174
packages/schema/src/plugins/prisma/schema-generator.ts (1)
33-33
: LGTM: Import statements are appropriate.
The new imports are well-chosen:
- getIdFields is used for handling ID fields in model relationships
- match from ts-pattern provides type-safe pattern matching
Also applies to: 35-35
packages/plugins/openapi/src/rest-generator.ts (2)
37-43
: LGTM! Constructor improvements enhance type safety and validation.
The changes improve the code in two ways:
- Better type safety by replacing
any
withPluginOptions
- Early validation of unsupported options
871-874
: LGTM! Improved REST API design by hiding implementation details.
Good change to exclude foreign key fields from create/update operations, following REST API best practices by not exposing internal database implementation details to API consumers.
packages/schema/src/plugins/zod/utils/schema-gen.ts (3)
Line range hint 27-28
: Ensure 'Decimal' type is correctly identified
The variable isDecimal
is set using field.type.type === 'Decimal'
. Confirm that 'Decimal'
is the correct string representation of the Decimal type in your context. If there's a possibility of the type reference being used instead, consider checking field.type.reference?.ref?.name
.
Please verify the correctness of this type check.
Line range hint 121-133
: Ensure 'now' function is correctly referenced in '@default' handling
In the getFieldSchemaDefault
function, when handling the 'now'
function from the standard library, ensure that the reference to 'now'
is accurate and that it aligns with the updated imports, especially since imports have been modified.
Please verify that 'now'
is correctly referenced and that isFromStdlib
works as expected. You can run the following script to check all usages of 'now'
in the codebase:
#!/bin/bash
# Description: Verify that 'now' function references are correct throughout the codebase.
# Test: Search for all occurrences of 'now' function usage.
# Expect: All references should be correct and consistent.
rg -A 2 -B 2 $'\bisFromStdlib\b.*\bnow\b'
Line range hint 11-15
: Import missing dependencies or remove unused ones
The function isDefaultWithAuth
is imported from '../../enhancer/enhancer-utils'
. Ensure that this import path is correct and that the module exports isDefaultWithAuth
. If not, update the import path or definition accordingly.
Please verify that isDefaultWithAuth
is properly imported and used. You can check for its definition with:
✅ Verification successful
Import path and function are correctly defined and used
The verification shows that:
- isDefaultWithAuth is properly exported from packages/schema/src/plugins/enhancer/enhancer-utils.ts
- The import path
'../../enhancer/enhancer-utils'
in the review file is correct relative to its location - The function is actively used in multiple files including the file under review (
schema-gen.ts
)
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify the import and usage of 'isDefaultWithAuth'.
# Test: Find the definition of 'isDefaultWithAuth'.
# Expect: The file should export 'isDefaultWithAuth'.
fd 'enhancer-utils.ts' --exec grep -Hn 'export function isDefaultWithAuth' {}
Length of output: 38
Script:
#!/bin/bash
# Let's try different approaches to verify the import and usage
# 1. Search for any export of isDefaultWithAuth (not just function)
rg -l "export.*isDefaultWithAuth"
# 2. Search for the definition of isDefaultWithAuth
ast-grep --pattern "export function isDefaultWithAuth"
# 3. Check the actual enhancer-utils.ts file content
fd "enhancer-utils.ts" --exec cat {}
# 4. Check where isDefaultWithAuth is being used
rg "isDefaultWithAuth"
Length of output: 1934
packages/schema/src/language-server/zmodel-scope.ts (9)
12-13
: Import isTypeDef
and isTypeDefField
for extended type handling
The addition of isTypeDef
and isTypeDefField
imports is appropriate for the enhanced handling of type definitions.
37-40
: Update imports to include type definitions
Including getAllLoadedAndReachableDataModelsAndTypeDefs
ensures that both data models and type definitions are considered during scope computation.
43-43
: Import isAuthOrAuthMemberAccess
for authorization checks
The import of isAuthOrAuthMemberAccess
from ./validator/utils
is necessary for determining when to include type definition scopes in authorization contexts.
139-142
: Determine when to include TypeDef fields in scope
The introduction of allowTypeDefScope
correctly identifies when type definition fields should be included in the scope, based on whether the access starts with auth()
or is within a type definition.
147-148
: Include TypeDef fields in scope resolution
By extending the condition to check isTypeDefField(ref)
, the scope resolution now appropriately includes fields from type definitions.
170-171
: Ensure auth()
resolves correctly with type definitions
The update to resolve auth()
to include type definitions by using createScopeForAuth
enhances the authorization scope handling.
187-189
: Properly set allowTypeDefScope
in collection predicates
Determining allowTypeDefScope
based on whether the collection starts with auth()
ensures consistent inclusion of type definitions in collection predicate scopes.
222-226
: Extend createScopeForContainer
to handle type definitions
The modification to createScopeForContainer
now includes an includeTypeDefScope
parameter, allowing it to handle both data models and type definitions as needed.
232-241
: Enhance createScopeForAuth
to include type definitions
By retrieving both data models and type definitions, createScopeForAuth
ensures that all relevant declarations are included when resolving authorization scopes.
packages/runtime/src/cross/mutator.ts (6)
4-4
: Import statement updated appropriately
The added imports are necessary for the new functionalities and types used in the code.
37-39
: Proper validation of queryData
The added check ensures that queryData
is either an object or an array before proceeding, which prevents potential runtime errors.
41-45
: Validation of queryOp
makes the function more robust
By ensuring that queryOp
starts with 'find'
, we guarantee that mutations are only applied to appropriate query results.
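A simplified sketch of the two guards just discussed (parameter names assumed; this is not the mutator's actual code):
// Only plain objects/arrays are considered, and only results of read
// ("find*") queries are candidates for optimistic mutation.
function shouldApplyMutation(queryOp: string, queryData: unknown): boolean {
    if (typeof queryData !== 'object' || queryData === null) {
        return false; // not an object or array
    }
    if (!queryOp.startsWith('find')) {
        return false; // mutations are only applied to read-query results
    }
    return true;
}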
293-296
: Handle relation fields without connect
appropriately
The current logic skips relation fields without connect
. Consider whether other nested mutation operations like create
or update
should also be handled.
Run the following script to check for other nested mutation operations:
#!/bin/bash
# Description: Search for usage of nested mutations other than 'connect' in mutation arguments.
# Search for 'create', 'update', 'upsert' operations in mutateArgs.data
rg "mutateArgs\.data.*\b(create|update|upsert)\b" packages/runtime/src/cross/mutator.ts
# Expected Result: Identify if other operations are being used and handle them if necessary.
429-445
: Ensure assignForeignKeyFields
handles missing mappings gracefully
If field.foreignKeyMapping
is not defined or incomplete, the function should handle it without causing errors.
Run the following script to verify that all fields have foreignKeyMapping
where necessary:
#!/bin/bash
# Description: Verify all relation fields have a defined `foreignKeyMapping`.
# Find all relation fields and check for `foreignKeyMapping`
rg "isDataModel: true" packages/runtime/src/ -A 5 | rg -B 5 -A 5 "foreignKeyMapping"
# Expected Result: All relation fields should have an associated `foreignKeyMapping`.
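If a defensive guard is desired, a minimal sketch could look like this; the field shape is assumed from the model-meta excerpts above, so treat it as an illustration only:
// Hypothetical guard: skip relation fields whose foreign key mapping is
// missing or empty instead of failing while assigning foreign key values.
type FieldInfoLike = { isDataModel?: boolean; foreignKeyMapping?: Record<string, string> };

function canAssignForeignKeys(field: FieldInfoLike): boolean {
    return (
        !!field.isDataModel &&
        !!field.foreignKeyMapping &&
        Object.keys(field.foreignKeyMapping).length > 0
    );
}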
127-128
: Ensure modelFields
is valid before proceeding
It's good practice to check if modelFields
is not undefined
to prevent possible runtime errors when accessing its properties.
Run the following script to ensure modelFields
is always defined when used:
packages/schema/src/language-server/zmodel-linker.ts (7)
27-27
: Imports updated correctly to include new types.
The additions of TypeDefFieldType
and isTypeDefField
to the imports ensure that the new type definitions are appropriately recognized in the file.
Also applies to: 38-38
40-40
: Imported functions from @zenstackhq/sdk
are correctly updated.
The import statement now includes getAuthDecl
, aligning with the updated usage in the code.
58-59
: New utility functions imported appropriately.
The imports of getAllLoadedAndReachableDataModelsAndTypeDefs
, getContainingDataModel
, and isMemberContainer
support the enhanced type resolution mechanisms introduced.
287-287
: Enhanced type handling in resolveInvocation
with updated declarations.
By switching to getAllLoadedAndReachableDataModelsAndTypeDefs
, the code now considers both data models and type definitions when retrieving declarations. The use of getAuthDecl
aligns with this update and ensures authentication declarations are accurately resolved.
Also applies to: 292-294
325-325
: Improved member access resolution using isMemberContainer
.
Updating the condition to use isMemberContainer
instead of isDataModel
broadens the scope to include all member-containing declarations, enhancing the flexibility and correctness of member access resolution.
343-343
: Correct handling of collection predicates with updated type checks.
The use of isMemberContainer
ensures that collection predicates are properly resolved when dealing with arrays of member-containing declarations, improving the robustness of predicate resolution.
519-519
: Ensure comprehensive handling of TypeDefFieldType
in resolveToDeclaredType
.
The method resolveToDeclaredType
now accepts TypeDefFieldType
, and the conditional checks have been updated accordingly. Verify that all calls to this method pass the correct type parameters and that the logic correctly handles nullability and array attributes for TypeDefFieldType
.
As a follow-up, you can run the following script to find all usages of resolveToDeclaredType
and review the passed types:
Also applies to: 521-521
✅ Verification successful
Let me gather more information about the implementation of resolveToDeclaredType
to verify how it handles TypeDefFieldType
.
Let me try a different approach to get the implementation using ripgrep with context.
TypeDefFieldType
is properly handled in resolveToDeclaredType
The implementation correctly handles TypeDefFieldType:
- Nullability check is performed via the isTypeDefField(type) condition
- Array attributes are properly handled through type.array
- Both built-in types (type.type) and reference types (type.reference) are supported
- The resolution logic is consistent across all parameter types
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Find all calls to `resolveToDeclaredType` and check parameter types.
# Expected: Calls pass FunctionParamType, DataModelFieldType, or TypeDefFieldType.
rg 'resolveToDeclaredType\(' packages/ -A 1
Length of output: 1388
Script:
#!/bin/bash
# Get the implementation of resolveToDeclaredType to verify TypeDefFieldType handling
ast-grep --pattern 'private resolveToDeclaredType($_: $_,$_: $_) {
$$$
}'
Length of output: 77
Script:
#!/bin/bash
# Get the implementation of resolveToDeclaredType with more context
rg "private resolveToDeclaredType" packages/schema/src/language-server/zmodel-linker.ts -A 30
Length of output: 1187
packages/sdk/src/utils.ts (5)
388-390
: New Function getForeignKeyFields
Added
The getForeignKeyFields
function provides a useful utility to retrieve foreign key fields from a relation field.
135-138
: Ensure Compatibility in hasAttribute
Function
The hasAttribute
function now includes TypeDef
and TypeDefField
in its decl
parameter. Verify that hasAttribute
correctly handles these types and that any nested function calls (like getAttribute
) are compatible.
30-31
: Imported New Types for Type Definitions
New imports for isTypeDef
, isTypeDefField
, TypeDef
, and TypeDefField
have been added to support type definitions.
Also applies to: 35-35
55-65
: Addition of getDataModelAndTypeDefs
Function
The new getDataModelAndTypeDefs
function is a valuable addition that retrieves both data models and type definitions. The implementation appears correct and aligns with existing patterns.
482-485
: Update Function Calls to getAuthDecl
The getAuthModel
function has been renamed to getAuthDecl
, and its parameter type has changed to (DataModel | TypeDef)[]
. Ensure that all references to getAuthModel
are updated to getAuthDecl
throughout the codebase to prevent potential runtime errors.
Run the following script to identify and update outdated references:
✅ Verification successful
No References to getAuthModel
Found - Function Rename is Clean
The verification shows that there are no occurrences of getAuthModel
in the codebase. All existing references are already using the new getAuthDecl
function name, which is properly imported and used across multiple files including:
- packages/schema/src/plugins/enhancer/policy/utils.ts
- packages/schema/src/plugins/enhancer/enhance/index.ts
- packages/schema/src/language-server/zmodel-scope.ts
- packages/schema/src/language-server/zmodel-linker.ts
- packages/sdk/src/model-meta-generator.ts
The function rename appears to have been completed thoroughly with no remaining references to update.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Find and update usages of `getAuthModel` to `getAuthDecl`.
# Search for usages of `getAuthModel` in the codebase.
rg 'getAuthModel' --files-with-matches
# Optionally, automate the replacement (use with caution).
# rg 'getAuthModel' -l | xargs sed -i 's/getAuthModel/getAuthDecl/g'
Length of output: 38
Script:
#!/bin/bash
# Let's try a more comprehensive search using ripgrep with context
rg -A 3 'getAuthModel|getAuthDecl'
# Also search for any potential imports or exports of these functions
ast-grep --pattern 'export function getAuthModel($_) { $$$ }'
ast-grep --pattern 'export function getAuthDecl($_) { $$$ }'
Length of output: 4469
packages/schema/src/plugins/zod/generator.ts (7)
2-23
: Appropriate addition of necessary imports for validation functionality
The added imports are required for the new validation features and are correctly included.
25-25
: Correct import of additional types and utilities from '@zenstackhq/sdk/ast'
The new imports are necessary for handling type definitions and validations.
36-36
: Addition of 'makeFieldSchema' import is appropriate
The imported function is used for generating field schemas in the new code.
154-154
: Simplification of saving generated files using 'forEach'
The change to use forEach
for saving files enhances code clarity and is appropriate.
Line range hint 304-340
: Correctly adding validation refinements for type definitions
The implementation accurately compiles @@validate
attributes into Zod refinements for type definitions.
461-461
: Integrating validation refinements into model schemas
The addition of validation refinements enhances schema accuracy and is correctly implemented.
668-735
: Implementation of validation refinements is correct and robust
The methods createRefineFunction
and makeValidationRefinements
effectively compile @@validate
attributes into Zod refinements, and error handling is properly managed.
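For reference, this is roughly what a compiled @@validate rule looks like as a Zod refinement; the example is hand-written for illustration and is not the generator's actual output:
// A type-level rule such as
//   @@validate(length(name, 1, 32), 'name must be 1-32 characters')
// ends up as a refinement attached to the base object schema.
import { z } from 'zod';

const ProfileSchema = z
    .object({
        name: z.string(),
        age: z.number().int().optional(),
    })
    .refine((value) => value.name.length >= 1 && value.name.length <= 32, {
        message: 'name must be 1-32 characters',
    });

// ProfileSchema.parse({ name: '', age: 3 }) throws with the message above.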
packages/schema/src/plugins/enhancer/enhance/index.ts (7)
6-10
: Imports are appropriate and necessary
The added imports getAuthDecl
, getDataModelAndTypeDefs
, and getRelationField
are necessary for the new functionalities introduced in the code.
99-101
: Correct usage of authentication declarations
The code correctly retrieves the authentication declaration using getAuthDecl
and generates the necessary authentication types and parameters.
143-148
: Properly returning newPrismaClientDtsPath
The generate
method correctly returns newPrismaClientDtsPath
when prismaTypesFixed
is true
, ensuring that downstream code can access the updated Prisma Client type definitions.
245-245
: Accurate determination of needsLogicalClient
The needsLogicalClient
getter correctly determines if a logical Prisma client generation is necessary based on the presence of delegate models, defaults with auth()
, or type definitions.
460-466
: Ensure safe modification of variable.type
When modifying variable.type
, ensure that it's safe to assign a new string value to it. If variable.type
expects a specific structure or type, directly assigning a string might cause issues.
Please verify that assigning variable.type = source;
is appropriate, and consider using the setType()
method if available.
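Assuming the object in question is a ts-morph VariableDeclaration, a safer alternative would look like this sketch (the file content and type names here are invented for the example):
// setType() goes through ts-morph's structured API instead of overwriting the
// node's `type` property with a raw string.
import { Project } from 'ts-morph';

const project = new Project({ useInMemoryFileSystem: true });
const file = project.createSourceFile('tmp.ts', 'const client: OldType = makeClient();');
const variable = file.getVariableDeclarationOrThrow('client');
variable.setType('NewPrismaClientType'); // instead of `variable.type = source`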
755-761
: Correctly generating extra types for type definitions
The generateExtraTypes
method correctly iterates over the model declarations and generates types for type definitions using generateTypeDefType
.
792-799
: Utility functions removeFromSource
and trimEmptyLines
are implemented correctly
The removeFromSource
function effectively removes specified text from the source code, and trimEmptyLines
cleans up any resulting empty lines, ensuring the source code remains well-formatted.
packages/plugins/openapi/src/rpc-generator.ts (19)
26-26
: Declaration of ANY_OBJECT
Constant
Defining const ANY_OBJECT = '_AnyObject';
introduces a catch-all object type for use when omitInputDetails
is enabled. This is appropriate for handling generalized input schemas.
37-38
: Addition of omitInputDetails
Property
Introducing private omitInputDetails: boolean;
allows control over the inclusion of input details in the generated OpenAPI specification, enhancing configurability.
39-46
: Constructor Update and Option Validation
The constructor now accepts PluginOptions
and initializes omitInputDetails
. Proper validation is in place, throwing a PluginError
if omitInputDetails
is not a boolean. This ensures robustness against invalid configurations.
166-168: Use `omittableRef` for Input Properties in `create` Operation
The `create` operation now uses `this.omittableRef` for `select`, `include`, and `data` fields. This change allows omission of detailed input types when `omitInputDetails` is true, providing flexibility in the generated API specification.
192-193: Consistent Application of `omittableRef` in `createMany` Operation
The `createMany` operation inputs now utilize `this.omittableRef`, ensuring consistent handling of input details across operations.
222-224: Update `findUnique` Operation Inputs with `omittableRef`
Using `this.omittableRef` for the `select`, `include`, and `where` properties in the `findUnique` operation aligns with the new `omitInputDetails` option.
245-247: Apply `omittableRef` to `findFirst` Operation Inputs
The `findFirst` operation's input properties now use `this.omittableRef`, maintaining uniformity in input detail handling.
268-270: Use `omittableRef` in `findMany` Operation
The `findMany` operation inputs are updated to `this.omittableRef`, ensuring that input details are conditionally included based on the `omitInputDetails` flag.
292-295: Update `update` Operation Inputs with `omittableRef`
The `update` operation now consistently applies `this.omittableRef` to its input properties, aligning with the handling of input details.
317-318: Apply `omittableRef` in `updateMany` Operation
Using `this.omittableRef` for `where` and `data` in the `updateMany` operation ensures consistent input detail management.
340-344: Update `upsert` Operation Inputs with `omittableRef`
The `upsert` operation inputs are now correctly using `this.omittableRef`, providing flexibility in the generated API according to the `omitInputDetails` configuration.
366-368: Use `omittableRef` in `delete` Operation
The `delete` operation's input properties now use `this.omittableRef`, maintaining consistency across CRUD operations.
389-389: Apply `omittableRef` to `deleteMany` Operation
Updating the `deleteMany` operation to use `this.omittableRef` for the `where` property aligns with the input detail handling strategy.
410-411: Update `count` Operation Inputs with `omittableRef`
The `count` operation's `select` and `where` properties now utilize `this.omittableRef`, ensuring input details are included or omitted as configured.
440-442: Use `omittableRef` in `aggregate` Operation
Applying `this.omittableRef` to the `aggregate` operation inputs for properties like `where`, `orderBy`, and `cursor` ensures consistent handling of input details.
466-469: Apply `omittableRef` to `groupBy` Operation Inputs
The `groupBy` operation inputs now use `this.omittableRef`, maintaining uniformity in the inclusion of input details across operations.
602-614: Consistent Use of `omittableRef` in Aggregate Fields
Aggregate fields such as `_count`, `_min`, `_max`, `_sum`, and `_avg` now use `this.omittableRef`. This change ensures that the inclusion of aggregate details is controlled by the `omitInputDetails` flag.
632-638: Conditional Generation of `_AnyObject` Schema
When `omitInputDetails` is true, the `_AnyObject` schema is generated as a catch-all object with `additionalProperties: true`. This implementation appropriately handles cases where detailed input schemas are omitted.
847-853: Implementation of `omittableRef` Method
The `omittableRef` method correctly returns a reference to `ANY_OBJECT` when `omitInputDetails` is true, and defaults to `this.ref` otherwise. This method centralizes the logic for conditional input detail inclusion.
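Taken together with the `_AnyObject` comment above, the dispatch can be as small as this sketch (the `ref` helper and the `openapi-types` typings are assumptions, not quoted from the generator):

```ts
import type { OpenAPIV3 } from 'openapi-types';

const ANY_OBJECT = '_AnyObject';

const ref = (type: string): OpenAPIV3.ReferenceObject => ({
    $ref: `#/components/schemas/${type}`,
});

// When input details are omitted, every input reference collapses to the
// catch-all object; otherwise reference the concrete Prisma input type.
function omittableRef(type: string, omitInputDetails: boolean): OpenAPIV3.ReferenceObject {
    return omitInputDetails ? ref(ANY_OBJECT) : ref(type);
}

omittableRef('UserCreateInput', true); // { $ref: '#/components/schemas/_AnyObject' }
```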
packages/schema/tests/schema/validation/attribute-validation.test.ts (4)
Line range hint 1-3: Correct inclusion of Jest types and utility imports.
The file appropriately references Jest types and imports required testing utilities, ensuring the tests have access to necessary functions and type definitions.
Line range hint 5-21: Well-structured test for built-in field attributes.
The test case effectively verifies the application of built-in attributes like `@id`, `@default`, `@unique`, and `@map` on model fields. It ensures that these attributes are correctly recognized and processed by the model loader.
Line range hint 372-724: Comprehensive testing of policy expressions is commendable.
The test cases thoroughly cover a variety of valid and invalid policy expressions, including operand type validations and operator compatibility checks. This extensive coverage enhances the robustness of the validation logic.
Line range hint 833-846: Correct validation of attribute usage on type declarations.
The test accurately checks that attributes like `@omit` cannot be applied to fields within type declarations, enforcing the correct usage rules for attributes.
packages/runtime/src/enhancements/node/delegate.ts (1)
637-666: Verify Deletion of Foreign Key Fields Does Not Affect Required Data
After transforming foreign key assignments to `connect` structures, the code deletes the original foreign key fields using `keysToDelete`. Ensure that this deletion does not inadvertently remove necessary data.
Run the following script to check for usages of foreign key fields after they have been deleted:
✅ Verification successful
Let me analyze the code further to understand the usage of deleted foreign key fields.
Let me try a different approach to analyze the code.
After analyzing the code, I can now provide a final response regarding the deletion of foreign key fields.
Foreign Key Field Deletion is Safe and Intentional
The deletion of foreign key fields after transforming them to `connect` structures is safe and intentional. The code follows these steps:
- Foreign key fields are only deleted after being properly transformed into equivalent `connect` structures
- The deletion happens at the end of the `fkAssignmentToConnect` method
- The transformed data is used in the subsequent operations
- No further access to the original foreign key fields occurs after deletion
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify that foreign key fields are not used after deletion.
# Expected: No references to deleted foreign key fields in subsequent code.
rg 'args\[(.*)\]' packages/runtime/src/enhancements/node/delegate.ts | grep -E "$(printf '%s|' "${keysToDelete[@]}" | sed 's/|$//')"
Length of output: 1379
Script:
#!/bin/bash
# Get the complete implementation of fkAssignmentToConnect and surrounding context
ast-grep --pattern 'private fkAssignmentToConnect($_) {
$$$
}'
Length of output: 66
Script:
#!/bin/bash
# Get the complete file content to understand the flow
cat packages/runtime/src/enhancements/node/delegate.ts
Length of output: 51789
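To make the transformation concrete, here is a standalone illustration of the shape change described above; the model and field names are hypothetical:

```ts
// A scalar FK assignment passed to a delegate-aware create/update...
const args: Record<string, unknown> = { title: 'Hello', authorId: 1 };

// ...is rewritten into the equivalent relation `connect`...
args['author'] = { connect: { id: args['authorId'] } };

// ...and the original FK key is deleted afterwards (the `keysToDelete` step).
delete args['authorId'];

console.log(args); // { title: 'Hello', author: { connect: { id: 1 } } }
```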
packages/plugins/tanstack-query/tests/react-hooks-v5.test.tsx (6)
284-346: Test case `optimistic create updating nested query` is well implemented
The test correctly checks the optimistic creation of a post associated with a user and verifies that the cache updates as expected.
348-400: Test case `optimistic nested create updating query` functions correctly
The implementation accurately tests the optimistic creation of nested data and ensures the cache reflects the new state.
Line range hint 541-591: Test case `optimistic update simple` is properly structured
The test effectively validates the optimistic update functionality for simple updates and verifies cache behavior.
593-648: Consider handling nested optimistic updates more robustly
While the test for `optimistic update updating nested query` is correct, ensure that the optimistic update handles complex nested structures appropriately, especially concerning deep merges in the cache.
Would you like to confirm that nested updates are merging correctly in the cache? I can assist with further tests or validations.
Line range hint 895-947: Review the logic in the `optimistic upsert - update simple` test case
Ensure that the optimistic upsert operation correctly differentiates between create and update paths and that the optimistic updates reflect the accurate state in each scenario.
Would you like assistance in writing additional assertions to validate both create and update paths within the upsert operation?
765-828: Ensure consistency in `nock` reply data
In the test cases involving `upsert`, the `nock` mock replies with `{ data: null }`. Verify that this aligns with expected API behavior and that the application handles such responses correctly.
Run the following script to search for instances where `nock` replies with `{ data: null }` and confirm it's intentional:
✅ Verification successful
The `{ data: null }` response in upsert tests is intentional and correct
After analyzing the test file patterns, the `{ data: null }` responses in upsert operations are intentionally used in optimistic update test cases where:
- The actual mutation response is not important because the test verifies optimistic updates in the cache
- The test specifically checks that the optimistic update works correctly before the server responds
- The pattern is consistently used across multiple similar test cases in the file
The test is working as intended, verifying that the optimistic updates are applied correctly to the cache regardless of the actual server response.
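For context, the mocking pattern under discussion looks roughly like this; the host and path below are placeholders:

```ts
import nock from 'nock';

// The mutation endpoint deliberately replies with `{ data: null }`: the test
// asserts on the optimistic cache update, not on the server's response body.
nock('http://localhost')
    .post(/.*\/post\/upsert$/)
    .reply(200, () => {
        return { data: null };
    });
```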
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Find all nock reply statements returning { data: null }
rg -A 2 "nock\(.*\)\s*\.\w+\(/.*\/\)\s*\.\w+\(200, \(\) => \{\s*return \{ data: null \};" tests/
Length of output: 267
Script:
#!/bin/bash
# Search for nock reply patterns with null data, accounting for various syntax
rg -U "\.reply\(200,.*data:\s*null" --type ts
Length of output: 47
Script:
#!/bin/bash
# Search for all nock reply patterns to understand the common response formats
rg -U "\.reply\(200,.*\{.*data:" --type ts -A 2
Length of output: 49
Script:
#!/bin/bash
# Find all test files that use nock with upsert operations
rg "upsert.*nock" --type ts -B 2 -A 2
Length of output: 39
Script:
#!/bin/bash
# Search for nock reply patterns in the specific test file
rg -U "\.reply\(200," packages/plugins/tanstack-query/tests/react-hooks-v5.test.tsx -A 3
Length of output: 8069
Script:
#!/bin/bash
# Look for other upsert test cases in the same file
rg "it\(.*upsert" packages/plugins/tanstack-query/tests/react-hooks-v5.test.tsx -A 10
Length of output: 2587
packages/plugins/tanstack-query/src/generator.ts (6)
17-17: LGTM!
The import of the `fs` module is appropriate for file system operations used in the `generateBundledTypes` function.
49-56: LGTM!
The validation and initialization of the `portable` option are correctly implemented, ensuring it defaults to `false` when not provided.
73-76: LGTM!
The conditional invocation of `generateBundledTypes` based on the `portable` option functions as intended.
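A small sketch of how that defaulting and gating typically reads; the helper names below are assumptions, not the plugin's actual wiring:

```ts
// Hypothetical helpers standing in for the generator's real steps.
declare function generateHooks(outDir: string): void;
declare function generateBundledTypes(outDir: string): void;

function run(outDir: string, options: { portable?: boolean }) {
    const portable = options.portable ?? false; // defaults to false when not provided
    generateHooks(outDir);
    if (portable) {
        // extra step only when portable output is requested
        generateBundledTypes(outDir);
    }
}
```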
349-349: LGTM!
The conditional assignment of `prismaImport` aligns with the `portable` option, ensuring correct import paths.
598-598: LGTM!
The addition of exporting `metadata` enhances the module's exports, making model metadata available for external use.
740-756: 🛠️ Refactor suggestion
Ensure error handling and verify license compliance when copying type definitions
The `generateBundledTypes` function copies type definitions from PrismaClient. Please consider the following:
- License Compliance: Copying files from PrismaClient may have licensing implications. Verify that this action complies with Prisma's license and terms of use.
- Error Handling: Currently, exceptions during file operations (e.g., reading or writing files) are not being handled. Consider adding try-catch blocks to manage potential errors from `fs.readFileSync` and `project.createSourceFile`; see the sketch after this comment.
To verify license compliance, you can check Prisma's license terms:
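On the error-handling point, a hedged sketch of the suggested try/catch around the copy step; the function name, output file name, and error wrapping are assumptions rather than the plugin's code:

```ts
import fs from 'fs';
import path from 'path';
import type { Project } from 'ts-morph';

function copyPrismaTypings(project: Project, prismaClientDtsPath: string, outDir: string) {
    let dts: string;
    try {
        dts = fs.readFileSync(prismaClientDtsPath, 'utf-8');
    } catch (err) {
        throw new Error(`failed to read PrismaClient typings from ${prismaClientDtsPath}: ${err}`);
    }
    // bundle the typings next to the generated hooks
    project.createSourceFile(path.join(outDir, 'prisma-types.d.ts'), dts, { overwrite: true });
}
```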
packages/schema/src/plugins/zod/transformer.ts (7)
2-8: LGTM: Import statements are correct.
The new imports from `@zenstackhq/sdk` are appropriately added and correctly used in the code.
68-68: LGTM: Correct generation of enum schema content.
The `content` variable is properly constructed to generate the enum schema using `this.generateImportZodStatement()` and `this.generateExportSchemaStatement()`.
81-81: LGTM: Handling of extra enums is correct.
The code correctly generates enum schema content for enums not referenced by data models, ensuring comprehensive enum support.
116-116: LGTM: Concatenation of extra imports and object schema.
The `content` variable combines `extraImports` and `objectSchema` effectively, ensuring all necessary imports are included.
250-251: LGTM: Passing `contextField` to `generatePrismaStringLine`.
The `contextField` is accurately retrieved and passed to `generatePrismaStringLine`, enabling context-aware schema generation.
325-330: Method signature updated correctly with `contextField`.
The `generatePrismaStringLine` method signature now includes `contextField`, which is appropriately typed as `DataModelField | undefined`.
345-348: Optionality logic enhanced to consider ZModel field inference.
The `optional` variable now accurately reflects field optionality by checking both `!field.isRequired` and `this.isFieldOptional(contextField)`, ensuring that fields with defaults or inferred optionality are correctly handled.
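To illustrate the effect with a hypothetical generated schema (not the transformer's output verbatim): a ZModel field with a default, or one inferred as optional from its context, ends up optional in the corresponding input schema:

```ts
import { z } from 'zod';

const PostCreateSchema = z.object({
    title: z.string(),
    // has @default(false) in the ZModel schema, so the input may omit it
    published: z.boolean().optional(),
});

PostCreateSchema.parse({ title: 'hello' }); // passes without `published`
```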
Actionable comments posted: 1
🧹 Outside diff range and nitpick comments (2)
packages/sdk/src/code-gen.ts (2)
24-37: Consider using async file operations
The function uses `saveSync()` which could block the event loop for large files. Consider using asynchronous operations for better performance.
-export function saveSourceFile(sourceFile: SourceFile) {
+export async function saveSourceFile(sourceFile: SourceFile) {
     sourceFile.replaceWithText(
         `/******************************************************************************
 * This file was generated by ZenStack CLI ${pkgJson.version}.
 ******************************************************************************/
 /* eslint-disable */
 // @ts-nocheck
 ${sourceFile.getText()}`
     );
     sourceFile.formatText();
-    sourceFile.saveSync();
+    await sourceFile.save();
 }
30-31: Reconsider blanket suppression of type checking
The blanket suppression of ESLint and TypeScript checking might hide legitimate issues. Consider using more targeted suppression or documenting why this is necessary.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
⛔ Files ignored due to path filters (32)
packages/plugins/trpc/tests/projects/nuxt-trpc-v10/server/trpc/routers/generated/client/Post.nuxt.type.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/nuxt-trpc-v10/server/trpc/routers/generated/client/User.nuxt.type.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/nuxt-trpc-v10/server/trpc/routers/generated/client/nuxt.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/nuxt-trpc-v10/server/trpc/routers/generated/client/utils.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/nuxt-trpc-v10/server/trpc/routers/generated/helper.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/nuxt-trpc-v10/server/trpc/routers/generated/routers/Post.router.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/nuxt-trpc-v10/server/trpc/routers/generated/routers/User.router.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/nuxt-trpc-v10/server/trpc/routers/generated/routers/index.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/nuxt-trpc-v11/server/trpc/routers/generated/client/Post.nuxt.type.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/nuxt-trpc-v11/server/trpc/routers/generated/client/User.nuxt.type.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/nuxt-trpc-v11/server/trpc/routers/generated/client/nuxt.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/nuxt-trpc-v11/server/trpc/routers/generated/client/utils.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/nuxt-trpc-v11/server/trpc/routers/generated/helper.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/nuxt-trpc-v11/server/trpc/routers/generated/routers/Post.router.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/nuxt-trpc-v11/server/trpc/routers/generated/routers/User.router.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/nuxt-trpc-v11/server/trpc/routers/generated/routers/index.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/t3-trpc-v10/src/server/api/routers/generated/client/Post.next.type.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/t3-trpc-v10/src/server/api/routers/generated/client/User.next.type.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/t3-trpc-v10/src/server/api/routers/generated/client/next.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/t3-trpc-v10/src/server/api/routers/generated/client/utils.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/t3-trpc-v10/src/server/api/routers/generated/helper.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/t3-trpc-v10/src/server/api/routers/generated/routers/Post.router.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/t3-trpc-v10/src/server/api/routers/generated/routers/User.router.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/t3-trpc-v10/src/server/api/routers/generated/routers/index.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/t3-trpc-v11/src/server/api/routers/generated/client/Post.react.type.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/t3-trpc-v11/src/server/api/routers/generated/client/User.react.type.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/t3-trpc-v11/src/server/api/routers/generated/client/react.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/t3-trpc-v11/src/server/api/routers/generated/client/utils.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/t3-trpc-v11/src/server/api/routers/generated/helper.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/t3-trpc-v11/src/server/api/routers/generated/routers/Post.router.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/t3-trpc-v11/src/server/api/routers/generated/routers/User.router.ts is excluded by !**/generated/**, !**/generated/**
packages/plugins/trpc/tests/projects/t3-trpc-v11/src/server/api/routers/generated/routers/index.ts is excluded by !**/generated/**, !**/generated/**
📒 Files selected for processing (2)
packages/ide/jetbrains/CHANGELOG.md (1 hunks)
packages/sdk/src/code-gen.ts (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- packages/ide/jetbrains/CHANGELOG.md
🔇 Additional comments (2)
packages/sdk/src/code-gen.ts (2)
18-18: Consider the implications of disabling `noImplicitAny`
Setting `noImplicitAny: false` relaxes TypeScript's type checking by allowing implicit `any` types. While this might align with broader changes in type handling, it could potentially lead to type-related bugs that would otherwise be caught by the compiler.
#!/bin/bash
# Description: Check for existing implicit any usage in the codebase
# Search for potential implicit any usage in TypeScript files
echo "Files with potential implicit any usage:"
rg --type ts ":\s*any\b" -l
# Check if there are any TypeScript configuration files that might be affected
echo -e "\nTypeScript configurations in the project:"
fd "tsconfig.*json$"
1-2: Verify the package.json import path
The import path `./package.json` seems unusual. Typically, package.json would be imported from the project root.
✅ Verification successful
The `./package.json` import path is correct
The import path is valid because:
- `packages/sdk/src/package.json` exists in the same directory as `code-gen.ts`
- The SDK package explicitly exports its package.json via the `exports` field in `packages/sdk/package.json`: `"./package.json": "./package.json"`
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify the location of package.json relative to code-gen.ts
# Find the location of code-gen.ts
code_gen_path=$(fd --type f "code-gen.ts$")
echo "code-gen.ts location: $code_gen_path"
# Check if package.json exists in the same directory
dirname_result=$(dirname "$code_gen_path")
if [ -f "$dirname_result/package.json" ]; then
echo "package.json exists in the same directory"
cat "$dirname_result/package.json" | jq .version
else
echo "package.json not found in the same directory"
fi
Length of output: 6556
Script:
#!/bin/bash
# Let's try a different approach to verify the package.json location and import path
# Find all package.json files in the repository
echo "All package.json files in the repository:"
fd "package.json$"
# Find the exact location of code-gen.ts
echo -e "\nLocation of code-gen.ts:"
fd "code-gen.ts$"
# Check the content of the package.json in the sdk directory
echo -e "\nContent of packages/sdk/package.json (if exists):"
if [ -f "packages/sdk/package.json" ]; then
cat "packages/sdk/package.json" | jq .
fi
Length of output: 2738
Actionable comments posted: 1
🧹 Outside diff range and nitpick comments (2)
packages/schema/src/plugins/enhancer/enhance/index.ts (2)
697-754: Consider extracting regex patterns to constants
The method contains multiple regex patterns that are used for type matching. Consider extracting these patterns to named constants at the class level for better maintainability and reusability.
Example:
private static readonly JSON_FIELD_TYPE_PATTERNS = [
    'GroupByOutputType',
    '(Unchecked)?Create(\S+?)?Input',
    '(Unchecked)?Update(\S+?)?Input',
    'CreateManyInput',
    '(Unchecked)?UpdateMany(Mutation)?Input',
];
793-799: Consider optimizing string manipulation
The current implementation makes multiple passes over the string. Consider combining the operations:
- private removeFromSource(source: string, text: string) {
-     source = source.replace(text, '');
-     return this.trimEmptyLines(source);
- }
-
- private trimEmptyLines(source: string): string {
-     return source.replace(/^\s*[\r\n]/gm, '');
- }
+ private removeFromSource(source: string, text: string) {
+     return source.replace(text, '').replace(/^\s*[\r\n]/gm, '');
+ }
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (3)
packages/schema/src/plugins/enhancer/enhance/index.ts (13 hunks)
packages/schema/src/plugins/enhancer/policy/policy-guard-generator.ts (3 hunks)
packages/sdk/src/model-meta-generator.ts (6 hunks)
🔇 Additional comments (9)
packages/sdk/src/model-meta-generator.ts (3)
29-29: LGTM: Import changes align with function updates
The new imports `getAuthDecl` and `saveSourceFile` correctly support the refactoring changes described in the summary.
Also applies to: 41-41
104-104: LGTM: Auth model generation now includes type definitions
The changes correctly integrate type definitions into the auth model generation process. The implementation is clean and follows TypeScript best practices.
Also applies to: 165-169
Line range hint 70-87: Verify the synchronous file saving behavior
The function has been changed from async to sync, and the file saving mechanism has been updated. While the changes look correct, we should verify:
- That `saveSourceFile` handles any necessary async operations internally
- That all callers of this function have been updated to handle the sync behavior
packages/schema/src/plugins/enhancer/policy/policy-guard-generator.ts (2)
29-29: LGTM: Import addition aligns with synchronous file saving changes.
The addition of the `saveSourceFile` import is consistent with the refactoring to make file saving synchronous.
Line range hint 61-90: Verify the synchronous file saving behavior.
The change from async to sync file saving could impact performance or error handling. Please ensure:
- The `saveSourceFile` function is designed for synchronous operation
- Error handling is properly maintained
- Performance is not significantly impacted for large files
✅ Verification successful
Based on the results, I can now generate a final response since I have found the implementation of `saveSourceFile`. The function is indeed designed for synchronous operation, using `saveSync()` internally.
The synchronous file saving change is safe and appropriate
The implementation of `saveSourceFile` in packages/sdk/src/code-gen.ts confirms it's designed for synchronous operation:
- Uses `sourceFile.saveSync()` internally
- Performs simple text manipulation and formatting synchronously
- No async operations or error handling patterns found
The change from async to sync is consistent with the underlying implementation and doesn't introduce any risks to error handling or performance.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Check if saveSourceFile is designed for sync operation
ast-grep --pattern 'function saveSourceFile($_) {
$$$
}'
# Check for any error handling patterns in the codebase
rg -A 5 'saveSourceFile.*catch'
Length of output: 994
packages/schema/src/plugins/enhancer/enhance/index.ts (4)
78-78: LGTM: Generate method changes
The changes to the generate method, including the addition of `newPrismaClientDtsPath` to the return type and its implementation, are well-structured and use proper path handling.
Also applies to: 144-149
674-695: Escape field names in the `fixDefaultAuthType` method
The field names are used directly in regular expressions without escaping. This could cause issues if field names contain special regex characters.
707-714: Escape field names in the `replacePrismaJson` function
The field name is used directly in a regular expression without escaping.
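A small helper of the kind both comments call for; the helper name and the usage below are illustrative only:

```ts
// Escape regex metacharacters before interpolating a field name into a pattern.
function escapeRegExp(s: string): string {
    return s.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
}

const fieldName = 'meta+data'; // hypothetical field name containing a special character
const pattern = new RegExp(`\\b${escapeRegExp(fieldName)}\\??:`);
pattern.test('meta+data?: Json'); // true; the "+" is matched literally, not as a quantifier
```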
756-762: LGTM: Extra types generation
The implementation is clean and straightforward, properly handling typedef generation.
No description provided.