feat: add cache to dojo.js #359


Merged — 2 commits, Dec 15, 2024
46 changes: 23 additions & 23 deletions package.json
@@ -1,25 +1,25 @@
{
"name": "dojo.js",
"scripts": {
"build": "bash ./scripts/build-packages.sh",
"build-examples": "bash ./scripts/build-examples.sh",
"clean": "bash ./scripts/clean.sh",
"prettier-check": "npx prettier --check {packages,examples}",
"prettier": "npx prettier --write .",
"release": "pnpm build && pnpm prettier && npx lerna publish --no-private --force-publish",
"docs": "npx typedoc --out docs",
"prepare": "husky install"
},
"devDependencies": {
"husky": "^9.1.6",
"lerna": "^8.1.5",
"prettier": "^3.3.3",
"tsup": "^8.1.0",
"typedoc": "^0.26.7",
"@typhonjs-typedoc/typedoc-theme-dmt": "^0.2.1",
"typedoc-plugin-coverage": "^3.3.0",
"@commitlint/cli": "^18.4.4",
"@commitlint/config-conventional": "^18.4.4",
"@ianvs/prettier-plugin-sort-imports": "^4.3.1"
}
"name": "dojo.js",
"scripts": {
"build": "bash ./scripts/build-packages.sh",
"build-examples": "bash ./scripts/build-examples.sh",
"clean": "bash ./scripts/clean.sh",
"prettier-check": "npx prettier --check {packages,examples}",
"prettier": "npx prettier --write .",
"release": "pnpm build && pnpm prettier && npx lerna publish --no-private --force-publish",
"docs": "npx typedoc --out docs",
"prepare": "husky install"
},
"devDependencies": {
"husky": "^9.1.6",
"lerna": "^8.1.5",
"prettier": "^3.3.3",
"tsup": "^8.1.0",
"typedoc": "^0.26.7",
"@typhonjs-typedoc/typedoc-theme-dmt": "^0.2.1",
"typedoc-plugin-coverage": "^3.3.0",
"@commitlint/cli": "^18.4.4",
"@commitlint/config-conventional": "^18.4.4",
"@ianvs/prettier-plugin-sort-imports": "^4.3.1"
}
}
77 changes: 73 additions & 4 deletions packages/state/src/recs/index.ts
@@ -8,12 +8,12 @@ import {
} from "@dojoengine/recs";
import {
Clause,
Entities,
EntityKeysClause,
OrderBy,
PatternMatching,
ToriiClient,
} from "@dojoengine/torii-client";

import { convertValues } from "../utils";

/**
@@ -144,12 +144,21 @@ export const getEntities = async <S extends Schema>(
orderBy: OrderBy[] = [],
entityModels: string[] = [],
limit: number = 100,
logging: boolean = false
logging: boolean = false,
{
dbConnection,
timestampCacheKey,
}: { dbConnection: IDBDatabase | undefined; timestampCacheKey: string } = {
dbConnection: undefined,
timestampCacheKey: "",
}
) => {
if (logging) console.log("Starting getEntities");
let offset = 0;
let continueFetching = true;

const time = dbConnection ? getCache(timestampCacheKey) : 0;

while (continueFetching) {
const entities = await client.getEntities({
limit,
@@ -158,11 +167,14 @@ export const getEntities = async <S extends Schema>(
order_by: orderBy,
entity_models: entityModels,
dont_include_hashed_keys: false,
internal_updated_at: time,

💡 Codebase verification

Based on the search results, I can now generate a final response:

The internal_updated_at parameter is not supported in the ToriiClient API

The codebase search reveals that the getEntities method in the ToriiClient API accepts a limited set of parameters: limit, offset, clause, order_by, and entity_models. There is no evidence of internal_updated_at being a supported parameter in any of the implementations or type definitions.

The current implementation might lead to runtime errors or ignored parameters. Consider:

  • Removing the internal_updated_at parameter from the query
  • Implementing the timestamp-based filtering logic at the application level after fetching the entities
  • Or consulting with the backend team to add this functionality if it's crucial for performance
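The application-level filtering mentioned above can be sketched as follows. This is a hypothetical illustration, not part of the PR: the `updated_at` field name and the `FetchedEntity` shape are assumptions, since the real Torii entity schema is not shown here.

```typescript
// Hypothetical sketch: drop entities older than the cached timestamp on the
// client side, instead of relying on an unsupported query parameter.
// The `updated_at` field (Unix seconds) is an assumption for illustration.

interface FetchedEntity {
    id: string;
    updated_at: number;
}

function filterSince(
    entities: FetchedEntity[],
    sinceTimestamp: number
): FetchedEntity[] {
    // Keep only entities updated at or after the cached timestamp.
    return entities.filter((e) => e.updated_at >= sinceTimestamp);
}

const sample: FetchedEntity[] = [
    { id: "a", updated_at: 100 },
    { id: "b", updated_at: 200 },
    { id: "c", updated_at: 300 },
];

const fresh = filterSince(sample, 200); // keeps "b" and "c"
```

This trades bandwidth for correctness: the full page is still fetched, but stale entities never reach `setEntities`, which mirrors what the `internal_updated_at` parameter would do server-side if it existed.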
🔗 Analysis chain

Including 'internal_updated_at' parameter in 'client.getEntities'

Adding internal_updated_at: time to the client.getEntities call will fetch entities updated since the cached timestamp. Confirm that the backend supports this parameter and that it effectively reduces redundant data retrieval.

Run the following script to check if internal_updated_at is a valid parameter in the ToriiClient API:

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Verify 'internal_updated_at' parameter usage in ToriiClient.

rg 'getEntities\(' ./node_modules/@dojoengine/torii-client -A 10

Length of output: 146


Script:

#!/bin/bash
# Let's search for getEntities in the entire codebase
rg "getEntities" -A 5

# Also search for any interface or type definitions related to this
ast-grep --pattern 'interface $_ {
  $$$
  getEntities$$$
  $$$
}'

# And check for any type definitions
ast-grep --pattern 'type $_ = {
  $$$
  getEntities$$$
  $$$
}'

Length of output: 17034

});

console.log("entities", entities);
if (dbConnection) {
await insertEntitiesInDB(dbConnection, entities);
}
Comment on lines +173 to +175

⚠️ Potential issue

Inserting entities into IndexedDB when 'dbConnection' is provided

The conditional insertion of entities into IndexedDB using insertEntitiesInDB if dbConnection is available adds offline capability or caching. Ensure that:

  • The insertEntitiesInDB function handles entities correctly.
  • Error handling within insertEntitiesInDB is robust.
  • The transaction logic accounts for potential failures.

Please review and test the insertEntitiesInDB function thoroughly to prevent data inconsistency.


if (logging) console.log(`Fetched ${entities} entities`);
if (logging) console.log(`Fetched entities`, entities);

setEntities(entities, components, logging);

@@ -172,6 +184,11 @@ export const getEntities = async <S extends Schema>(
offset += limit;
}
}

if (dbConnection) {
const currentTime = Math.floor(Date.now() / 1000);
setCache(currentTime, timestampCacheKey);
}
Comment on lines +188 to +191

⚠️ Potential issue

Add error handling for localStorage operations

The timestamp caching operations should include error handling for cases where localStorage is unavailable or fails.

    if (dbConnection) {
        const currentTime = Math.floor(Date.now() / 1000);
-       setCache(currentTime, timestampCacheKey);
+       try {
+           setCache(currentTime, timestampCacheKey);
+       } catch (error) {
+           console.warn('Failed to set cache timestamp:', error);
+           // Continue execution as caching is non-critical
+       }
    }
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
if (dbConnection) {
const currentTime = Math.floor(Date.now() / 1000);
setCache(currentTime, timestampCacheKey);
}
if (dbConnection) {
const currentTime = Math.floor(Date.now() / 1000);
try {
setCache(currentTime, timestampCacheKey);
} catch (error) {
console.warn('Failed to set cache timestamp:', error);
// Continue execution as caching is non-critical
}
}

};

/**
@@ -209,6 +226,7 @@ export const getEvents = async <S extends Schema>(
order_by: orderBy,
entity_models: entityModels,
dont_include_hashed_keys: false,
internal_updated_at: 0,

⚠️ Potential issue

Review timestamp handling consistency

Both getEvents and getEntitiesQuery set internal_updated_at: 0, while getEntities uses a cached timestamp. This inconsistency might lead to unexpected behavior.

Consider either:

  1. Implementing consistent timestamp caching across all methods
  2. Documenting why these methods behave differently

Also applies to: 305-305

},
historical
);
@@ -284,6 +302,7 @@ export const getEntitiesQuery = async <S extends Schema>(
order_by: orderBy,
entity_models: entityModels,
dont_include_hashed_keys: false,
internal_updated_at: 0,
});

while (continueFetching) {
@@ -449,3 +468,53 @@ export const setEntities = async <S extends Schema>(
}
}
};

const setCache = (time: number, timestampCacheKey: string) => {
const timeString = Math.floor(time).toString();
localStorage.setItem(timestampCacheKey, timeString);
};

const getCache = (timestampCacheKey: string) => {
return Number(localStorage.getItem(timestampCacheKey) || 0);
};
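For reference, the round-trip behavior of these two helpers can be exercised outside a browser with a Map-backed stand-in for the storage object. The `storage` stub below is illustrative only; the real helpers call the global `localStorage`.

```typescript
// Map-backed stand-in for localStorage so the cache logic runs outside a
// browser; the real code uses the global localStorage directly.
const backing = new Map<string, string>();
const storage = {
    getItem: (k: string): string | null => backing.get(k) ?? null,
    setItem: (k: string, v: string): void => {
        backing.set(k, v);
    },
};

const setCache = (time: number, timestampCacheKey: string): void => {
    const timeString = Math.floor(time).toString();
    storage.setItem(timestampCacheKey, timeString);
};

const getCache = (timestampCacheKey: string): number =>
    Number(storage.getItem(timestampCacheKey) || 0);

const before = getCache("entities-ts"); // missing key reads back as 0
setCache(1734220800.75, "entities-ts"); // fractional seconds are floored
const after = getCache("entities-ts");
```

The `|| 0` fallback means a missing or empty key yields `0`, so the first run of `getEntities` fetches everything — which is the intended cold-start behavior.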
Comment on lines +472 to +479

🛠️ Refactor suggestion

Enhance cache utility functions with type safety and error handling

The cache utility functions need improvements for robustness and maintainability.

+ const CACHE_KEY_PREFIX = 'dojo_cache_';
+ 
+ interface CacheError extends Error {
+   type: 'storage_unavailable' | 'invalid_data';
+ }

- const setCache = (time: number, timestampCacheKey: string) => {
+ const setCache = (time: number, timestampCacheKey: string): void => {
+   try {
+     if (!timestampCacheKey) {
+       throw new Error('Invalid cache key');
+     }
    const timeString = Math.floor(time).toString();
-   localStorage.setItem(timestampCacheKey, timeString);
+   localStorage.setItem(`${CACHE_KEY_PREFIX}${timestampCacheKey}`, timeString);
+   } catch (error) {
+     const cacheError: CacheError = new Error(
+       `Failed to set cache: ${error.message}`
+     ) as CacheError;
+     cacheError.type = 'storage_unavailable';
+     throw cacheError;
+   }
};

- const getCache = (timestampCacheKey: string) => {
+ const getCache = (timestampCacheKey: string): number => {
+   try {
+     if (!timestampCacheKey) {
+       throw new Error('Invalid cache key');
+     }
-   return Number(localStorage.getItem(timestampCacheKey) || 0);
+   const value = localStorage.getItem(
+     `${CACHE_KEY_PREFIX}${timestampCacheKey}`
+   );
+   return value ? Number(value) : 0;
+   } catch (error) {
+     console.warn(`Failed to get cache: ${error.message}`);
+     return 0;
+   }
};


async function insertEntitiesInDB(
db: IDBDatabase,
entities: Entities
): Promise<void> {
return new Promise((resolve, reject) => {
const transaction = db.transaction(["entities"], "readwrite");
const store = transaction.objectStore("entities");

let completed = 0;
let error: Error | null = null;

// Handle transaction completion
transaction.oncomplete = () => {
if (error) {
reject(error);
} else {
resolve();
}
};

transaction.onerror = () => {
reject(transaction.error);
};

// Store each entity
for (const [entityId, data] of Object.entries(entities)) {
const entityData = {
id: entityId,
...data,
};

const request = store.put(entityData);
completed++;

request.onerror = () => {
error = request.error;
};
}
});
}
Comment on lines +481 to +520

🛠️ Refactor suggestion

⚠️ Potential issue

Implementing 'insertEntitiesInDB' function for IndexedDB storage

The insertEntitiesInDB function adds entities to IndexedDB, but there are concerns about error handling and transaction management:

  • Error Propagation: Errors in individual store.put requests are assigned to the error variable but do not affect the transaction unless explicitly handled.
  • Transaction Completion: If a request.onerror occurs, the transaction may still complete successfully unless aborted.

Refactor the function to properly handle errors and ensure the transaction is aborted when a request fails:

async function insertEntitiesInDB(
    db: IDBDatabase,
    entities: Entities
): Promise<void> {
    return new Promise((resolve, reject) => {
        const transaction = db.transaction(["entities"], "readwrite");
        const store = transaction.objectStore("entities");

        transaction.oncomplete = () => resolve();
        transaction.onerror = () => reject(transaction.error);

        for (const [entityId, data] of Object.entries(entities)) {
            const entityData = {
                id: entityId,
                ...data,
            };

            const request = store.put(entityData);

            request.onerror = () => {
                transaction.abort(); // Abort the transaction on error
                reject(request.error);
            };
        }
    });
}

This ensures that any error during the put operations will abort the transaction and reject the promise, preventing partial data writes.
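Since IndexedDB is browser-only, the all-or-nothing property the refactor aims for can be illustrated with a reduced in-memory simulation. The `FakeStore` class below is invented for illustration and is not part of the IndexedDB API; it only demonstrates why aborting on the first failure prevents partial writes.

```typescript
// Reduced simulation of transactional "abort on first error" semantics:
// entities are staged, and committed only if every put succeeds.
// FakeStore is a teaching device, not a real IndexedDB wrapper.

interface Entity {
    id: string;
    [key: string]: unknown;
}

class FakeStore {
    private committed = new Map<string, Entity>();

    // Stage every entity; commit all of them only if no put fails
    // (mirrors transaction.abort() discarding staged writes).
    putAll(entities: Entity[], failOn?: string): boolean {
        const staged = new Map<string, Entity>();
        for (const e of entities) {
            if (e.id === failOn) {
                return false; // abort: discard everything staged so far
            }
            staged.set(e.id, e);
        }
        for (const [id, e] of staged) this.committed.set(id, e);
        return true;
    }

    size(): number {
        return this.committed.size;
    }
}

const db = new FakeStore();
const ok = db.putAll([{ id: "a" }, { id: "b" }]); // both committed
const failed = db.putAll([{ id: "c" }, { id: "bad" }], "bad"); // neither kept
```

In the real implementation, the analogous guarantee comes from calling `transaction.abort()` in `request.onerror`, so the `oncomplete` handler never fires for a transaction that had a failed put.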

