… and preferably with a Web GUI?
For example, solutions like Proxmox are really nice and sophisticated, but only if all of your nodes are running Proxmox, in which case you can connect to and manage them from a single instance. It's no wonder that dedicated-server companies provide ready-to-use Proxmox ISOs, as well as ESXi, as an ideal way to split a dedicated server into multiple virtual machines.
But what happens if the virtual machines, containers or dedicated servers you manage are spread across different public clouds or hosting providers? For example: KVM-based virtual machines on DigitalOcean / Linode, LXC containers on Proxmox hosts, bare-metal servers at OVH / Hetzner, etc. Ideally without an agent, although installing an agent would still make my life easier overall.
The goal is to have something easy to install and maintain: use a single server as a "master" node, then connect to and manage any other slave server (virtual machine, container, or dedicated box) via SSH commands or an agent. A simple web-based GUI could report health status and resource usage, and (why not) perhaps run remote shell commands.
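To make the agentless idea concrete, here is a minimal sketch of what such a "master" node would do under the hood: run one health command on each slave over plain SSH and collect the results for a web GUI to render. The hostnames, the exact health command, and the JSON shape are all my own assumptions, not any particular tool's behavior; key-based SSH auth is assumed.

```python
#!/usr/bin/env python3
"""Sketch of an agentless poller: one SSH round-trip per slave."""
import json
import subprocess

# Hypothetical inventory: any mix of VMs, LXC containers, bare metal.
HOSTS = ["vm1.example.com", "lxc2.example.com", "metal3.example.com"]

# One remote command printing three lines: 1-min load, used/total MB, root disk %.
HEALTH_CMD = (
    "cut -d' ' -f1 /proc/loadavg; "
    "free -m | awk '/^Mem:/{print $3\"/\"$2}'; "
    "df -h / | awk 'NR==2{print $5}'"
)

def parse_health(output: str) -> dict:
    """Turn the three-line command output into a dict a web GUI could render."""
    load, mem, disk = output.strip().splitlines()
    return {"load1": float(load), "mem_used_mb": mem, "disk_used": disk}

def poll(host: str, timeout: int = 10) -> dict:
    """Run HEALTH_CMD on one host via plain ssh (BatchMode: fail instead of
    prompting for a password, so a dead host can't hang the poller)."""
    proc = subprocess.run(
        ["ssh", "-o", "BatchMode=yes", host, HEALTH_CMD],
        capture_output=True, text=True, timeout=timeout,
    )
    if proc.returncode != 0:
        return {"host": host, "error": proc.stderr.strip()}
    return {"host": host, **parse_health(proc.stdout)}

if __name__ == "__main__":
    print(json.dumps([poll(h) for h in HOSTS], indent=2))
```

A real tool would add concurrency, retries and persistence, but this is roughly the entire protocol an agentless solution needs.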
There is obviously no need for compute / storage / network separation.
So far I have looked at:
a) orchestration / configuration-management tools such as Ansible, Chef, Puppet, etc., or Kubernetes / Swarm (although those are aimed primarily, if not exclusively, at containers), but they are either limited to the CLI or really heavyweight to set up.
b) OpenStack = shoot me now (sorry OpenStack users)
c) OpenNebula is probably the closest to this need, but it also seems a bit heavy to set up (though not too complex if you have already tried Kubernetes).
d) oVirt is very nice, but it requires CentOS as the base operating system for the slave nodes, which is a problem if a remote virtual machine is already running Ubuntu.
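For what it's worth, the CLI-only route from option (a) is already very small with Ansible: it is agentless (SSH only), and a hand-written inventory can mix hosts from any provider. This is just an illustrative fragment with placeholder hostnames, assuming key-based SSH access; the hosts and file names are made up.

```shell
# Hypothetical inventory mixing providers; Ansible needs nothing on the slaves
# beyond SSH and Python.
cat > inventory.ini <<'EOF'
[all]
vm1.example.com
lxc2.example.com
metal3.example.com
EOF

# Ad-hoc health checks against every host (no agent to install):
ansible -i inventory.ini all -m shell -a 'uptime'
ansible -i inventory.ini all -m setup -a 'filter=ansible_memory_mb'
```

What it lacks, as noted above, is exactly the web GUI part of the requirement.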
Thank you in advance for any suggestions / pointers.