I have a folder containing a Python script (main.py), a Dockerfile, and a requirements.txt. The Python script runs fine on its own locally.

The Python script:
import os
from fastapi import FastAPI
import uvicorn

app = FastAPI()
SHARED_VOLUME_PATH = '/mnt/shared'

@app.get("/")
def process_data():
    data = "From Primary car"
    with open(os.path.join(SHARED_VOLUME_PATH, 'shared_data_file.txt'), 'w') as f:
        f.write(data)
    print("Data processed and stored in shared volume")
    response = "Data processed and stored in shared volume"
    return response

if __name__ == '__main__':
    port = int(os.environ.get("PORT", 8080))
    uvicorn.run(app, host="0.0.0.0", port=port)
The Dockerfile content is the following:
FROM python:3.11
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY main.py .
CMD ["python", "main.py"]
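A quick local smoke test of the image looks like this (the tag `scar-prim-component` is just a local example name); note that the script writes to /mnt/shared, so that path needs a volume mounted or the endpoint will fail inside the container:

```shell
# Build the image locally from the folder containing the Dockerfile
docker build -t scar-prim-component .

# Mount a host folder at /mnt/shared, since the script writes there
mkdir -p ./shared
docker run --rm -p 8080:8080 -v "$(pwd)/shared:/mnt/shared" scar-prim-component

# In another terminal: curl http://localhost:8080/
# should create ./shared/shared_data_file.txt
```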
When I run the command
gcloud builds submit --tag europe-west4-docker.pkg.dev/<GCP-project-ID>/artifactory/scar-prim-component:latest
I get this error:
ERROR: (gcloud.builds.submit) HTTPError 412: 'us' violates constraint 'constraints/gcp.resourceLocations'
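From what I have read, this error is not about the code at all: the organization policy constraints/gcp.resourceLocations restricts where resources may be created, while `gcloud builds submit` by default stages the source in a `us` multi-region Cloud Storage bucket and runs the build outside Europe. A sketch of the usual fix, pinning the build to the same region as the registry in the tag (bucket name below is a placeholder you would create yourself):

```shell
# Run the build in europe-west4 instead of the default global/us location
gcloud builds submit \
  --region=europe-west4 \
  --tag europe-west4-docker.pkg.dev/<GCP-project-ID>/artifactory/scar-prim-component:latest

# Or set the region once for all future builds in this project
gcloud config set builds/region europe-west4

# If the policy still rejects the default staging bucket, stage the source
# in an EU bucket of your own:
gcloud builds submit \
  --region=europe-west4 \
  --gcs-source-staging-dir=gs://<your-eu-bucket>/source \
  --tag europe-west4-docker.pkg.dev/<GCP-project-ID>/artifactory/scar-prim-component:latest
```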
I am not very strong in this area. Please advise what I need to change so that I can push my image to my Artifact Registry repository (named "artifactory") in GCP.
I ran the same script and performed the same steps in a different GCP project, and there it built and deployed fine.
If there is an alternative way to build the image and push it to Artifact Registry, please advise the steps in detail.
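One common alternative that sidesteps Cloud Build entirely (and therefore the location constraint on its staging bucket) is building the image locally with Docker and pushing it directly; a sketch, assuming Docker is installed and the account has Artifact Registry Writer permission on the repository:

```shell
# 1. Let Docker authenticate to the europe-west4 Artifact Registry host
gcloud auth configure-docker europe-west4-docker.pkg.dev

# 2. Build the image locally from the folder containing the Dockerfile
docker build -t europe-west4-docker.pkg.dev/<GCP-project-ID>/artifactory/scar-prim-component:latest .

# 3. Push it straight to Artifact Registry (no Cloud Build involved)
docker push europe-west4-docker.pkg.dev/<GCP-project-ID>/artifactory/scar-prim-component:latest
```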