GraphGrid is a tool for developers to create graph-enhanced AI solutions for their teams. If you're looking to build one, you've come to the right place!
GraphGrid AI contains these core modules to build graph-enhanced AI solutions.
What's inside 📦?
GraphGrid uses Docker to run as a containerized application, or "package." The package includes the files and framework needed to run both custom-built GraphGrid images and the public Docker images required for service integration.
- The `docker-compose.yml` file contains configurations for the many services in GraphGrid, including environment variables and volume mounts.
- The `bin` folder contains start/stop and install scripts.
- The `nlp` folder contains several NLP models for the nlp service (Enterprise Edition only).
- The `data` directory contains volume mount destinations, ONgDB plugins, and some setup files.
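To make the layout concrete, here is a small shell sketch that mocks up the directory structure described above. The `graphgrid-demo` name is purely illustrative; a real GraphGrid package already ships with these files and folders.

```shell
# Mock up the package layout described above -- illustrative only;
# a real GraphGrid package already contains these files and folders.
mkdir -p graphgrid-demo/bin graphgrid-demo/nlp graphgrid-demo/data
touch graphgrid-demo/docker-compose.yml

# List the top-level contents, one entry per line
ls -1 graphgrid-demo
```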
GraphGrid AI comes in two package sizes: full and light.
The full package is an all-in-one bundle that includes everything needed for a standalone deployment. Most notably, it contains all the Docker images and models required to run GraphGrid NLP.
Full packages follow the naming convention
The light package contains everything except the Docker images, which greatly reduces its size.
GraphGrid AI instead uses Docker images already on the system, or downloads them during its first run.
Light packages follow the naming convention
- Docker Engine
- Docker Compose
- RAM - We recommend allocating at least `20 GB` to smoothly run GraphGrid. Using certain services like GraphGrid NLP may require more.
- If using Docker for Mac, we suggest setting the Docker Resources like so:
- CPUs: minimum 5
- Memory: minimum
- Swap: minimum
- Ports - It is required that the following ports are open on the server: