Describe the bug
I use the Docker container locally with Ollama (llama3:latest).
I configured everything following the comment here: #354 (comment)
I use minicpm-v:latest in combination with it and also copied the models.
Ollama is installed in the newest version.
The backend and frontend start up without errors.
I uploaded a screenshot and the generation starts. The problem: no matter what I try, only the News Page mock is returned, every time. There are no errors in the container console, nothing.
To Reproduce
Simply follow the instructions in the comment above and try to generate something. There are absolutely no errors; only the mock is returned every time.
I also tried setting some more variables in .env because I thought maybe the SHOULD_MOCK_AI_RESPONSE=false variable wasn't being picked up.
Then I even tried setting IS_PROD=true.
Nothing helps.
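One thing worth checking (an assumption on my side, not verified against the repo's config code): if the backend parses this variable by calling Python's `bool()` on the raw string, then `SHOULD_MOCK_AI_RESPONSE=false` would still count as enabled, because any non-empty string is truthy:

```python
import os

# Simulate the .env setting
os.environ["SHOULD_MOCK_AI_RESPONSE"] = "false"

# Pitfall: bool() on any non-empty string is True,
# so this naive parse treats the string "false" as enabled
naive = bool(os.environ.get("SHOULD_MOCK_AI_RESPONSE", ""))
print(naive)  # True

# Safer: normalize and compare the string explicitly
safe = os.environ.get("SHOULD_MOCK_AI_RESPONSE", "false").strip().lower() in ("true", "1", "yes")
print(safe)  # False
```

If the config uses something like this naive pattern, the only reliable ways to disable mocking would be to unset the variable entirely or to leave it an empty string.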
Normally the .env looks like this:
OPENAI_BASE_URL="http://host.docker.internal:12342/v1"
OPENAI_API_KEY="none"
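To rule out a connectivity problem, it may also help to confirm that the backend container can actually reach the OpenAI-compatible endpoint. A minimal sketch (the URL is the one from my .env above; the helper name is made up for illustration):

```python
import json
import urllib.error
import urllib.request

def endpoint_reachable(base_url: str, timeout: float = 5.0) -> bool:
    """Return True if an OpenAI-compatible /models endpoint answers with JSON."""
    try:
        with urllib.request.urlopen(f"{base_url}/models", timeout=timeout) as resp:
            json.load(resp)  # any valid JSON response counts as reachable
            return True
    except (urllib.error.URLError, OSError, ValueError):
        return False

# Run this from inside the backend container, not from the host:
print(endpoint_reachable("http://host.docker.internal:12342/v1"))
```

If this prints False inside the container, the requests never reach Ollama and a fallback to the mock (rather than a surfaced error) could explain the behavior.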
And the strangest part: since I run this on a Mac mini on my desk, I can hear exactly when the fan spins up and something is happening. So something does seem to be sent to the Ollama LLM, but something odd is going on that I cannot explain, because only the mock is generated.
As a last attempt, I even made sure the code can never enter the mock branch:
~/Projekte/screenshot-to-code/backend/routes/generate_code.py:

```python
await send_message("chunk", content, variantIndex)

if 1 == 2:  # forced to False so the mock branch can never run
    completions = [await mock_completion(process_chunk, input_mode=input_mode)]
else:
```
But even after this change and restarting the container, it still only generates the News mock page.
Here is the log from the container, showing that there are no errors:
2024-11-28 20:21:16 frontend-1 | [TypeScript] Found 0 errors. Watching for file changes.
2024-11-28 20:21:16 frontend-1 |
2024-11-28 20:21:19 backend-1 | INFO: ('172.18.0.1', 64220) - "WebSocket /generate-code" [accepted]
2024-11-28 20:21:19 backend-1 | INFO: connection open
2024-11-28 20:21:54 backend-1 | INFO: connection closed
I am out of ideas :)