I followed the guide at https://github.com/localstack/localstack-pro-samples/tree/master/sagemaker-inference but got the following error:
Creating bucket…
Uploading model data to bucket…
Creating model in SageMaker…
Adding endpoint configuration…
Creating endpoint…
Traceback (most recent call last):
  File "/Users/tom/prossamples/main.py", line 134, in <module>
    run_regular()
  File "/Users/tom/prossamples/main.py", line 118, in run_regular
    deploy_model(test_run)
  File "/Users/tom/prossamples/main.py", line 53, in deploy_model
    sagemaker.create_endpoint(EndpointName=f"{ENDPOINT_NAME}{run_id}", EndpointConfigName=f"{CONFIG_NAME}{run_id}")
  File "/Users/tom/prossamples/venv/lib/python3.11/site-packages/botocore/client.py", line 514, in _api_call
    return self._make_api_call(operation_name, kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/tom/prossamples/venv/lib/python3.11/site-packages/botocore/client.py", line 938, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (InternalError) when calling the CreateEndpoint operation (reached max retries: 4): exception while calling sagemaker.CreateEndpoint: Docker not available.
I started LocalStack Pro in Docker mode and ran python main.py from the project's virtual environment in PyCharm.
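
For reference, this is a minimal sketch of how I understand the failing call is set up (not the sample's exact main.py; the endpoint URL, region, dummy credentials, and the ENDPOINT_NAME / CONFIG_NAME / run_id values below are placeholders I'm assuming):

```python
import boto3

# Sketch of the client setup as I understand it; the sample's main.py may differ.
# The endpoint URL assumes the default LocalStack edge port on localhost.
sagemaker = boto3.client(
    "sagemaker",
    endpoint_url="http://localhost:4566",
    region_name="us-east-1",
    aws_access_key_id="test",
    aws_secret_access_key="test",
)

# Placeholder values standing in for the constants used in the sample.
ENDPOINT_NAME = "ls-sagemaker-endpoint"
CONFIG_NAME = "ls-sagemaker-endpoint-config"
run_id = "1"

# This is the call that fails with "Docker not available" (main.py line 53).
sagemaker.create_endpoint(
    EndpointName=f"{ENDPOINT_NAME}{run_id}",
    EndpointConfigName=f"{CONFIG_NAME}{run_id}",
)
```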
The error says Docker is not available, even though LocalStack itself is running in Docker, so I'm not sure what is missing. Please give me some support.