
Only News Mock Website is generated #444


Closed
Steff81 opened this issue Nov 28, 2024 · 1 comment

Comments


Steff81 commented Nov 28, 2024

Describe the bug
I run the Docker container locally with Ollama (llama3:latest).
I configured everything following the comment here: #354 (comment)
I also use minicpm-v:latest alongside it and copied the models as described.
Ollama is installed at the newest version.

The backend and frontend then start up without errors.
I upload a screenshot and generation starts. The problem: no matter what I try, only the News Page mock is returned, every single time. No error in the container console, nothing.

(attached screenshot: 2024-11-28, 20:22)

To Reproduce

  • Simply follow the instructions in the comments above and try to generate something.
  • There are absolutely no errors; only the mock is returned every time.

I also tried setting some more variables in .env because I thought maybe the SHOULD_MOCK_AI_RESPONSE=false variable was not being picked up.
Then I even tried setting IS_PROD=true.

Nothing helps.
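Flags like this arrive as plain strings from the environment, so a truthiness mix-up is easy. A minimal sketch of how a boolean env flag is typically parsed (`parse_bool_flag` is a hypothetical helper for illustration, not code from this repo):

```python
import os

def parse_bool_flag(value: str) -> bool:
    """Interpret a string env value like "true"/"false" as a boolean."""
    return value.strip().lower() in ("true", "1", "yes")

# Read the flag the way a backend typically would; the variable name
# matches the one mentioned above. Note the flag is read at startup,
# so .env changes only take effect after a container restart.
should_mock = parse_bool_flag(os.environ.get("SHOULD_MOCK_AI_RESPONSE", "false"))
```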

Normally my .env looks like this:
OPENAI_BASE_URL="http://host.docker.internal:12342/v1"
OPENAI_API_KEY="none"
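A quick way to sanity-check such a base URL is to compare its port against the defaults of common local OpenAI-compatible servers (11434 for Ollama, 1234 for LM Studio, per their respective docs). `identify_backend` below is a hypothetical helper, not part of screenshot-to-code:

```python
from urllib.parse import urlparse

# Default ports of common local OpenAI-compatible servers
# (assumption: 11434 for Ollama, 1234 for LM Studio).
KNOWN_PORTS = {11434: "Ollama", 1234: "LM Studio"}

def identify_backend(base_url: str) -> str:
    """Guess which local server a base URL points at from its port."""
    port = urlparse(base_url).port
    return KNOWN_PORTS.get(port, f"unknown (port {port})")

print(identify_backend("http://host.docker.internal:12342/v1"))
# -> unknown (port 12342)
```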

And the strangest part: since I run this on a Mac mini on my desk, I can hear exactly when the fan spins up and something is happening. So something does seem to be sent to the Ollama LLM, yet in some way I cannot explain, only the mock is generated.

My last attempt was to make sure the code can never enter the mock if-branch:

~/Projekte/screenshot-to-code/backend/routes/generate_code.py (around line 246):

    await send_message("chunk", content, variantIndex)

    if 1 == 2:
        completions = [await mock_completion(process_chunk, input_mode=input_mode)]
    else:

But even after this and restarting the container, it still only generates the News mock page.

Here is the container log, showing that there are no errors:
2024-11-28 20:21:16 frontend-1 | [TypeScript] Found 0 errors. Watching for file changes.
2024-11-28 20:21:16 frontend-1 |
2024-11-28 20:21:19 backend-1 | INFO: ('172.18.0.1', 64220) - "WebSocket /generate-code" [accepted]
2024-11-28 20:21:19 backend-1 | INFO: connection open
2024-11-28 20:21:54 backend-1 | INFO: connection closed

I am out of ideas :)


Steff81 commented Nov 28, 2024

Omg... I mixed up the Ollama and LM Studio ports...
Sorry, I will close this...
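For anyone who hits the same symptom: Ollama's OpenAI-compatible endpoint listens on port 11434 by default, while LM Studio's local server defaults to 1234. A working .env for Ollama would therefore presumably look like this (a sketch based on those defaults, not verified against this repo):

```
OPENAI_BASE_URL="http://host.docker.internal:11434/v1"
OPENAI_API_KEY="none"
```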

Steff81 closed this as completed Nov 28, 2024