Running LT in a Docker container

Hi Team,

I am trying to run LT with the n-gram data in a Docker container, using Amazon Corretto as the base image.
I am unable to start the Java server when building the Docker image.

Here is the error that I get at the last step:
RUN java -cp languagetool-server.jar org.languagetool.server.HTTPServer --port 8081 --allow-origin * --public --languagemodel
---> Running in 2bbafb1fbd4d
Error: Could not find or load main class org.languagetool.server.HTTPServer

Here is the Dockerfile that I am using:
FROM amazoncorretto

RUN yum update -y
RUN yum install sudo -y
RUN sudo yum install java -y
RUN sudo yum install wget -y
RUN wget https://languagetool.org/download/LanguageTool-stable.zip
RUN sudo yum install unzip -y
RUN unzip LanguageTool-stable.zip
RUN cd LanguageTool-4.8/
RUN cd
RUN wget https://languagetool.org/download/ngram-data/ngrams-en-20150817.zip
RUN rm LanguageTool-stable.zip
RUN unzip ngrams-en-20150817.zip
RUN cd LanguageTool-4.8/
RUN java -cp languagetool-server.jar org.languagetool.server.HTTPServer --port 8081 --allow-origin '*' --public --languagemodel

However, if I run all of the steps on an EC2 instance, I can start the Java server fine.

Thanks in advance.

You might need to quote the * differently here, though I'm not sure; "*" might work. Also, the language model parameter is called --languageModel and requires the path to the downloaded ngrams. Which version of Java do you use?
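To see why the quoting matters: in a shell-form RUN (or any interactive shell), an unquoted * is glob-expanded to matching file names before java ever sees it. A small demo, with made-up file names for illustration:

```shell
# Work in a scratch directory so the glob has something to match.
cd "$(mktemp -d)"
touch a.txt b.txt

echo --allow-origin *      # unquoted: the shell expands * to a.txt b.txt
echo --allow-origin '*'    # single-quoted: java would receive a literal *
echo --allow-origin "*"    # double-quoted: also a literal *
```

If the directory happens to contain no matching files, the unquoted * is passed through unchanged, which is why the same command can behave differently inside and outside a container.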

Thank you for your response:

Java version:
openjdk 11.0.5 2019-10-15 LTS
OpenJDK Runtime Environment Corretto-11.0.5.10.1 (build 11.0.5+10-LTS)
OpenJDK 64-Bit Server VM Corretto-11.0.5.10.1 (build 11.0.5+10-LTS, mixed mode)

I tried
java -cp languagetool-server.jar org.languagetool.server.HTTPServer --port 8081 --allow-origin "*" --public --languageModel
Same issue.
The strange thing is that the same command works outside the container.

Maybe looking at the existing Dockerfiles might help? GitHub - languagetool-org/languagetool: Style and Grammar Checker for 25+ Languages
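For what it's worth, here is a minimal sketch in that direction — an assumption-laden example, not the poster's actual solution. The base image tag, the /ngrams directory, and the LanguageTool-4.8 path are guesses taken from the commands above. Two things stand out in the original Dockerfile: each RUN step runs in a fresh shell, so `RUN cd LanguageTool-4.8/` does not change the directory for later steps (which would explain the jar not being found on the classpath), and the server should be started with CMD at container run time rather than with RUN at build time:

```dockerfile
# Hypothetical sketch, not the actual file from this thread.
FROM amazoncorretto:11

RUN yum install -y wget unzip

# Fetch and unpack LanguageTool (version path assumed from the post above).
RUN wget https://languagetool.org/download/LanguageTool-stable.zip && \
    unzip LanguageTool-stable.zip && \
    rm LanguageTool-stable.zip

# Fetch the English n-gram data into a known directory.
RUN wget https://languagetool.org/download/ngram-data/ngrams-en-20150817.zip && \
    mkdir /ngrams && \
    unzip ngrams-en-20150817.zip -d /ngrams && \
    rm ngrams-en-20150817.zip

# WORKDIR persists across build steps and into the running container,
# unlike `RUN cd ...`, which only affects its own shell.
WORKDIR /LanguageTool-4.8

EXPOSE 8081

# Start the server at `docker run` time, not at build time. The exec form
# passes * through literally, with no shell around to glob-expand it.
CMD ["java", "-cp", "languagetool-server.jar", \
     "org.languagetool.server.HTTPServer", \
     "--port", "8081", "--allow-origin", "*", "--public", \
     "--languageModel", "/ngrams"]
```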

Thanks, it worked.

@Athak, what did your Dockerfile eventually look like? I would like to deploy LT on a SageMaker endpoint.