chore: Update project directory name to prompt2code

parent ad8eb978d9
commit fd6c942ee7
@@ -75,7 +75,7 @@ To install Devika, follow these steps:
 1. Clone the Devika repository:
    ```bash
-   git clone https://github.com/stitionai/devika.git
+   git clone https://git.telavivmakers.space/ro/prompt2code.git
    ```
 2. Navigate to the project directory:
    ```bash
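For an already-cloned working copy, the rename above can be applied in place rather than re-cloning — a sketch, assuming the remote is named `origin`:

```shell
# Repoint an existing clone at the renamed repository.
git remote set-url origin https://git.telavivmakers.space/ro/prompt2code.git

# Verify both fetch and push URLs now point at prompt2code.
git remote -v
```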
@@ -83,7 +83,7 @@ To install Devika, follow these steps:
    ```
 3. Create a virtual environment and install the required dependencies (you can use any virtual environment manager):
    ```bash
-   uv venv
+   python -m venv venv
 
    # On macOS and Linux.
    source .venv/bin/activate
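One detail worth noting in the hunk above: `uv venv` creates its environment in `.venv/`, while `python -m venv venv` creates `venv/`, so the activation path must match whichever tool is actually used. A minimal sketch of step 3 with the stock `venv` module:

```shell
#!/bin/sh
set -e

# Create the environment with the standard-library venv module;
# this produces ./venv (not ./.venv as `uv venv` would).
python3 -m venv venv

# On macOS and Linux: activate from the matching path.
. venv/bin/activate

# Confirm the interpreter now resolves inside the environment.
python -c 'import sys; print(sys.prefix)'
```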
@@ -1,5 +1,7 @@
 # Challenge: Generate img2vid using GenAI only!
 
 To generate a video from images using GenAI, we must first set up the Devika IDE on the TAMI server and then fix it to generate the code for the img2vid task.
 
+Tech specs:
+
 Find the Tesla-P40 spec on the TAMI server using the following command:
@@ -1,27 +1,29 @@
 version: "3.9"
 
 services:
-  ollama-service:
-    image: ollama/ollama:latest
-    expose:
-      - 11434
-    ports:
-      - 11434:11434
-    healthcheck:
-      test: ["CMD-SHELL", "curl -f http://localhost:11434/ || exit 1"]
-      interval: 5s
-      timeout: 30s
-      retries: 5
-      start_period: 30s
-    networks:
-      - devika-subnetwork
+  # ollama is running locally
+  # ollama-service:
+  #   image: ollama/ollama:latest
+  #   expose:
+  #     - 11434
+  #   ports:
+  #     - 11434:11434
+  #   healthcheck:
+  #     test: ["CMD-SHELL", "curl -f http://localhost:11434/ || exit 1"]
+  #     interval: 5s
+  #     timeout: 30s
+  #     retries: 5
+  #     start_period: 30s
+  #   networks:
+  #     - devika-subnetwork
 
   devika-backend-engine:
     build:
       context: .
       dockerfile: devika.dockerfile
-    depends_on:
-      - ollama-service
+    # ollama is running locally
+    # depends_on:
+    #   - ollama-service
    expose:
       - 1337
     ports:
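Because the compose change assumes ollama now runs on the host rather than inside the stack, it can help to probe the API before `docker compose up`. A sketch reusing the same `curl -f` check the commented-out healthcheck ran (port 11434 as in the compose file):

```shell
#!/bin/sh
# Probe the local ollama API the backend now relies on.
# `curl -f` exits non-zero unless the server answers with a 2xx
# status, which is exactly how the old container healthcheck
# decided healthy vs. unhealthy.
if curl -sf http://localhost:11434/ >/dev/null; then
    echo "ollama reachable on :11434"
else
    echo "ollama not reachable -- start it before docker compose up"
fi
```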