Using Clusters

Creating a Cluster

To create a cluster, the Genvid Bastion must be running. You can use the genvid-bastion script to manage it. The following command starts the minimal services required.

genvid-bastion install --bastionid mybastion --checkmodules --update-global-tfvars --loadconfig
--bastionid mybastion

A unique identifier for your bastion (a quick validity check appears after this option list). It must:

  • Be between 3 and 32 characters.
  • Only contain lowercase letters, numbers, or hyphens.
  • Start with a letter.
--checkmodules
Use this option to install new modules if none exist or update the ones already present.
--update-global-tfvars
Use this option to update the global Terraform variables.
--loadconfig
Use this option to load the jobs and logs.
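
The naming rules for --bastionid translate into a simple pattern. The following check is only an illustrative sketch, not part of the Genvid tooling; it reuses the mybastion ID from the command above.

# Quick check that a candidate bastion ID follows the rules above
# (3-32 characters, lowercase letters/numbers/hyphens, starting with a letter).
echo "mybastion" | grep -E '^[a-z][a-z0-9-]{2,31}$' && echo "valid" || echo "invalid"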

The next step is to open the Bastion-UI website to manage the clusters.

genvid-bastion monitor

On the Bastion-UI page, you can customize the Bastion name.

  • Modify the Bastion name.
  • Click Update.
Choosing a unique name for the Bastion

On the Terraform page:

  • Click Add Config.
  • Enter a unique ID.
  • Choose cluster as the category.
Create a cluster

You can select another backend if needed, though some backends may require configuring additional variables. For now, you can keep the default values.

Cloning a Cluster

It is good practice to use the cloning function when you want to create multiple clusters with a similar configuration. To do so, click the Clone button in the Terraform configuration view.

Clone a cluster

Cluster Statuses

After creating a cluster, its status is EMPTY. This means that the cluster needs a module. The cluster statuses are:

  • VOID: The cluster doesn’t exist.
  • EMPTY: Cluster created but without a module.
  • DOWN: Cluster is initialized but resources are empty.
  • UP: Terraform apply procedure has succeeded.
  • BUSY: A command is currently running.
  • ERROR: An error occurred when checking the cluster.
  • INVALID: Invalid or unknown status.

Importing a Terraform Module

Before configuring the cluster, you must initialize it with a module. This copies the module template and downloads any modules and plugins it requires.

  • Click Commands.
  • Select the SDK-{version}/basic/basic_cluster module.
  • Click Import module.
Terraform module

You will see an initialization log appear. This step should last only a few seconds.

Terraform Settings

Terraform settings

We use Terraform to build the cluster infrastructure, so these setting values configure how that infrastructure is created. Click the Settings sub-link for your cluster and edit the settings.

For convenience, you can also download the settings on the page to a JSON file. Drag and drop the edited file onto the form to change multiple settings at once or to revert to a previous configuration. Note that any unsaved changes will be lost.
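
For example, assuming the downloaded file is a flat map of setting names to values and was saved as settings.json (the file name and exact layout are assumptions, not guaranteed by the UI), you could edit a few values from the command line with jq before dragging the result back onto the form:

# Sketch only: bump the game server count and switch regions in a downloaded
# settings file, writing the result to a new file to drop onto the form.
jq '.instance_game_count = 2 | .region = "us-west-2"' settings.json > settings-edited.json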

Each setting is listed below with its description and default value.

admin_password
The administrator password for the Windows machine. The Windows machine is only accessible from the other server instances and from other machines that share the same external IP as yours [1]. Default: 1genvid6

ami_prefix
A common prefix for the game AMI. This is the prefix we use in the Save the AMI section. You can change it if you want to experiment with your own AMI. Default: default

ami_version
A filter prefix for finding the right version of the AMIs. You can change it if you want to bind your cluster to a different SDK version. Default: the current SDK version

datacenter
The datacenter name Nomad uses to group local agents together. Default: default

force_az
Force all instances to be configured in a given availability zone. It should be the same as the one in Start the AMI. The availability zone has to be specified in full, such as us-east-1d. If a zone doesn’t have any G2 or C5 instances available, you must change to a zone that does. Currently, there’s no way to know in advance if an instance will be available in a specific zone. Default: none

instance_encoding_count
Specifies the number of encoding servers that Terraform should create. Default: 1

instance_encoding_type
Specifies the type of AWS instance that Terraform should use for encoding servers. Default: c5.2xlarge

instance_game_count
Specifies the number of game servers that Terraform should create. Default: 1

instance_game_type
Specifies the type of AWS instance that Terraform should use for game servers. Default: g2.2xlarge

instance_internal_count
Specifies the number of internal servers that Terraform should create. Default: 1

instance_internal_type
Specifies the type of AWS instance that Terraform should use for internal servers. Default: t2.small

instance_public_count
Specifies the number of public servers that Terraform should create. Default: 1

instance_public_type
Specifies the type of AWS instance that Terraform should use for public servers. Default: t2.small

instance_server_count
Specifies the number of admin servers that Terraform should create. Default: 1

instance_server_type
Specifies the type of AWS instance that Terraform should use for admin servers. Default: t2.small

region
The AWS region where you want to create your cluster. You can use any region that contains G2 instances. Default: us-east-1

namespace
The name of the current deployment project on your cluster. This value is part of the AWS Tags configuration, making it easier to group resources together in the AWS Console. Default: deployment

stage
The name of the stage your cluster should belong to. This value is part of the AWS Tags configuration, making it easier to group resources together in the AWS Console. Default: dev

vpc_cidr_block
Specifies the CIDR block Terraform uses to create new subnets in a newly created VPC. It works in conjunction with vpc_auto_create. You can switch it to any CIDR block you prefer. Default: 10.0.0.0/16
[1] Future versions of the Cluster will use a bastion host instead for increased security.

The values of the following two settings are provided automatically to the Terraform configuration via environment variables. You don’t need to set them manually.

  • cluster is the name of your cluster.
  • bastionid is the name of your Bastion.

The basic_cluster configuration takes care of everything: creating a new VPC, key pair, and so on. If you want more control, or if your permissions don’t let you create an AWS VPC or Roles, we also provide the minimal_cluster configuration, which lets you supply those elements yourself:

Each setting is listed below with its description and default value.

key_pair_private
Specifies the private part of the SSH key used to manage the EC2 instances. The key should look something like this: -----BEGIN RSA PRIVATE KEY-----\n…\n-----END RSA PRIVATE KEY-----\n (a key-generation sketch follows this list). Default: null

key_pair_public
Specifies the public part of the SSH key used to manage the EC2 instances. The key should look something like this: ssh-rsa … Default: null

iam_policy_name_game
Specifies the IAM policy that Terraform should attach to game servers. The default value lets Terraform create new policies. Default: null

iam_policy_name_server
Specifies the IAM policy that Terraform should attach to Genvid servers. The default value lets Terraform create new policies. Default: null

vpc_id
Specifies the VPC ID to use instead of creating a new one, which enables sharing VPCs between clusters. A vpc_id should look something like this: vpc-… Default: null
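
If you need a key pair for key_pair_private and key_pair_public, one way to produce one in the expected format is with ssh-keygen. The file name genvid_cluster_key below is just an example.

# Generate an RSA key pair in the traditional PEM format shown above.
# genvid_cluster_key holds the private key (-----BEGIN RSA PRIVATE KEY-----)
# and genvid_cluster_key.pub holds the public key (ssh-rsa ...).
ssh-keygen -t rsa -b 4096 -m PEM -N "" -f genvid_cluster_key

Paste the contents of the two generated files into the corresponding settings.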

See the Terraform Configuration Documentation for more information on how to set up these variables globally or locally.

Applying a Terraform Infrastructure

Terraform Apply is the operation that builds the cluster infrastructure.

  1. Click Plan Apply to create an execution plan.
  2. Verify the changes match what you need.
  3. After verifying the changes, click Apply to execute the plan.
Terraform Plan Apply
Terraform Apply

You should see the log starting to appear.

If Terraform fails to build the infrastructure:
  • Check the error message.
  • Update the settings.
  • Apply again.

At the end of this step, you have a cluster running in the cloud. When you check All Configs on the Terraform page, the status is UP. This means the infrastructure is up, but the Genvid SDK is not yet installed on it.

Important

Beginning in version 1.19.0, instances provision in the background after Terraform creates them. You need to wait until an instance registers as a Nomad client in the Cluster UI before using it. In particular, Windows instances may take up to 30 minutes before they are ready.

Note

The public machine IP is not the same as the one used when setting up your AMI.

To access your game machine, use the IP from game_public_ips with the same password set during the AMI setup.

To retrieve the game_public_ips:

  1. Make sure the cluster is UP.
  2. Go to the Commands page of your cluster.
  3. Click the OUTPUT button.

You’ll find the game_public_ips listed in the JSON file.
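
If you save that JSON locally, a query like the following lists the IPs. This is only a sketch: the file name output.json is an assumption, and the exact shape of the output JSON may differ from what the filter expects.

# Sketch: extract the game machine IPs from a locally saved copy of the
# Terraform output JSON (assumes game_public_ips is a top-level key).
jq '.game_public_ips' output.json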

Setting up the SDK on a Running Cluster

Once a cluster’s status is UP, you need to set up the Genvid SDK on it and run your project there.

Follow the SDK in the Cloud guide to do so.

Destroying a Terraform Infrastructure

The terraform destroy command removes all resources created by the cluster configuration. You should only destroy a cluster’s Terraform infrastructure when you no longer need to run your project on that cluster. To destroy the Terraform infrastructure, click Plan destroy.

If you’re sure you want to destroy the current Terraform infrastructure, click the Destroy button to confirm.

See Destroy Infrastructure for more information.

Deleting a Cluster

To delete a cluster, you must first destroy its Terraform infrastructure. Then go to All Configs and click the Delete button.

Using Custom Repositories

You can add and remove individual Terraform repositories in the bastion. Each repository contains one or more modules that can be used to instantiate a cluster. Use the following command to list the current repositories:

genvid-clusters module-list

Use the following command to add a new module:

genvid-clusters module-add -u {URL} {name}
  • {URL} can be any source compatible with go-getter, including local files.
  • {name} is the destination folder for this repository on your bastion.
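
For example, assuming a repository of Terraform modules hosted on GitHub or stored in a local folder (both the URL and the name my-modules below are placeholders), the command looks like this:

# Add a repository hosted on GitHub (placeholder URL) under the name my-modules.
genvid-clusters module-add -u github.com/myorg/terraform-modules my-modules

# Or add a repository from a local folder (placeholder path).
genvid-clusters module-add -u ./terraform-modules my-modules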

After the URL is cloned into the bastion repository, it is available as a source under modules/module. See Terraform’s Module Configuration for more details on using modules.

Bastion remembers the origin of each module, so you can update them using the following command:

genvid-clusters module-update [name]

The name is optional. If you don’t provide a name, it will update all repositories.

You can remove a module using the following command:

genvid-clusters module-remove {name}
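
Continuing the placeholder example above, updating and removing that repository would look like this:

# Update only the my-modules repository (placeholder name from the earlier example).
genvid-clusters module-update my-modules

# Update every repository registered in the bastion.
genvid-clusters module-update

# Remove the my-modules repository when you no longer need it.
genvid-clusters module-remove my-modules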

See also

genvid-clusters
Genvid Cluster-Management script documentation.
Bastion API for Terraform
Bastion API for Terraform.
Terraform’s Module Configuration
Documentation of Modules on Terraform.
go-getter
A library for fetching URLs in Go.