We’re currently running on a small server with 8 GB of RAM, but we also need Akeneo, which requires Elasticsearch as well.
To make both of these easier to maintain (hopefully), I am currently considering putting everything into Docker containers.
My experience with Docker is still very limited (basically 24 hours).
For now I run an Elasticsearch instance in Docker on the Zammad server to avoid having to reinstall Zammad, but my Akeneo cannot connect to it over the external IP.
Now my question:
Docker is just a container technology and, as I understand it, meant more for testing or “playing around”.
How can I set up a production environment in which I can update from your Docker container without losing my SQL data?
If that works, I could run Akeneo the same way; that would mean two Elasticsearch instances, but they shouldn’t get in each other’s way.
Please note that this is a very detailed Docker question which, in my opinion, barely touches the Zammad universe. You should be fine if you tell your other Docker-based tool where to find your Elasticsearch container - if needed, even via port exposing; see the sketch below. *
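For illustration, a minimal sketch of that idea, assuming a user-defined Docker network and placeholder names (`shared-es`, `my-akeneo-image` and the environment variable are not real Zammad or Akeneo artifacts): the Akeneo side joins a network that the Elasticsearch container is also attached to, so it can reach Elasticsearch by container name instead of the external IP.

```yaml
# Hypothetical Akeneo-side compose snippet; all names are placeholders.
# The network is created once on the host:
#   docker network create shared-es
#   docker network connect shared-es elasticsearch
services:
  akeneo:
    image: my-akeneo-image        # placeholder image
    environment:
      # Inside the shared network, the container resolves by name:
      - ELASTICSEARCH_HOST=http://elasticsearch:9200   # assumed variable name
    networks:
      - shared-es

networks:
  shared-es:
    external: true   # refers to the network created above
```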
This mainly has nothing to do with the SQL database. I strongly recommend using docker-compose for production use; the single Docker container is for testing only and, by default, will lose its data upon restarting.
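The key to not losing data across updates is keeping the database on a named volume, outside the container filesystem. A minimal sketch, assuming a plain PostgreSQL service (service name, image tag and password are placeholders, not Zammad’s official compose file):

```yaml
services:
  postgres:
    image: postgres:15               # example tag
    environment:
      POSTGRES_PASSWORD: change-me   # placeholder
    volumes:
      # Named volume: survives `docker-compose pull && docker-compose up -d`,
      # because it lives outside the container's writable layer.
      - postgres-data:/var/lib/postgresql/data

volumes:
  postgres-data:
```

Updating then means pulling the newer images and recreating the containers; the volume, and with it your SQL data, stays in place.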
* Please note that exposing Elasticsearch to the internet is a very bad idea, because you would need authentication for that. Please ensure you’re not exposing it to unauthenticated hosts/persons, as Zammad’s search index will contain sensitive data.
The docker-compose file will also help you with exposing ports:
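A sketch of what such a port mapping could look like (image tag and settings are assumptions); binding to 127.0.0.1 publishes the port on the host only, which avoids the exposure problem described in the note above:

```yaml
services:
  elasticsearch:
    image: elasticsearch:7.17.0    # example tag
    environment:
      - discovery.type=single-node # single-host setup, no cluster
    ports:
      # host:container - binding to 127.0.0.1 keeps it off the internet
      - "127.0.0.1:9200:9200"
```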
Technically, you can also run two Elasticsearch containers on the same host, as long as they’re using different ports. Again, this goes quite deep into Docker territory.
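For completeness, a sketch of that two-instance setup under the same assumptions as above (names and ports are placeholders): both containers listen on 9200 internally but are published on different host ports, so Zammad and Akeneo each get their own instance.

```yaml
services:
  elasticsearch-zammad:
    image: elasticsearch:7.17.0
    environment:
      - discovery.type=single-node
    ports:
      - "127.0.0.1:9200:9200"   # Zammad's instance on host port 9200

  elasticsearch-akeneo:
    image: elasticsearch:7.17.0
    environment:
      - discovery.type=single-node
    ports:
      - "127.0.0.1:9201:9200"   # same container port, different host port
```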