I have a job I'm trying to run in Localstack. It has a number of steps that execute various scripts, append environment variables to .bashrc, and ultimately run a jar. The steps that set up the environment look something to the effect of:
[
{
"Jar": "command-runner.jar",
…
"Properties": "",
"Args": [
"bash",
"-c",
"source /home/hadoop/.bashrc && echo -e \"export …;\" >> /home/hadoop/.bashrc;"
],
"Type": "CUSTOM_JAR"
},
{
"Jar": "command-runner.jar",
…
"Properties": "",
"Args": [
"bash",
"-c",
"source /home/hadoop/.bashrc && aws s3 sync s3://…;"
],
"Type": "CUSTOM_JAR"
},
{
"Jar": "command-runner.jar",
…
"Properties": "",
"Args": [
"bash",
"-c",
"source /home/hadoop/.bashrc && (cat /home/hadoop/.bashrc && source /home/hadoop/.bashrc && env && …; export … \" >> /home/hadoop/.bashrc)"
],
"Type": "CUSTOM_JAR"
},
…
]
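For context, the mechanism these steps rely on is simple: each `bash -c` step runs in a fresh shell, so exports are persisted by appending them to .bashrc, and later steps re-source that file to pick them up. A minimal sketch of the pattern (the temp file and the FOO variable are illustrative stand-ins for /home/hadoop/.bashrc and the real exports):

```shell
# Each `bash -c` step is a fresh shell: an export made in one step is gone
# in the next unless it was appended to a file that later steps re-source.
RC=$(mktemp)                                # stand-in for /home/hadoop/.bashrc
bash -c "echo 'export FOO=bar' >> $RC"      # "setup" step: persist the export
bash -c "source $RC && echo \"FOO=\$FOO\""  # later step: re-source and read it
```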
That works fine in AWS, but in Localstack, when a script uses a variable from an export that was appended to .bashrc, the variable is empty. If I check the Localstack container, nothing has been appended to .bashrc and no scripts have been synced from S3. So my questions are:
- Where do my steps get executed? I do not see containers being spun up to act as my cluster.
- How can I get this job to run locally without workarounds like running scripts on the Localstack container manually, since that would defeat the purpose of replicating prod-like behavior locally?