Get Started with Virtual Servers
What Virtual Servers are, what makes them unique, and how to deploy them onto CoreWeave Cloud
CoreWeave Cloud's Virtual Servers are highly configurable virtual machines, deployed and managed on CoreWeave Cloud via the easy-to-use Cloud UI or programmatically via the Kubernetes API. They enable anyone to deploy and host applications at scale with high availability.
Powerful CPU-only and high-performance NVIDIA GPU-accelerated Virtual Servers offer extensive resource configurability, and are deployed and ready to use in seconds.
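Because Virtual Servers are managed through the Kubernetes API, they can be declared as manifests and applied with `kubectl`. The sketch below illustrates the general shape of a `VirtualServer` resource; the exact API version, field names, and available instance types are assumptions from memory and should be checked against CoreWeave's current documentation before use.

```yaml
# Hypothetical sketch of a CoreWeave VirtualServer manifest.
# Field names and values are illustrative assumptions, not a verified spec.
apiVersion: virtualservers.coreweave.com/v1alpha1
kind: VirtualServer
metadata:
  name: example-vs
spec:
  region: ORD1              # one of CoreWeave's Data Center Regions
  os:
    type: linux
  resources:
    cpu:
      count: 4
    memory: 16Gi
  storage:
    root:
      size: 40Gi
  initializeRunning: true   # start the server as soon as it is created
```

A manifest like this would typically be deployed with `kubectl apply -f example-vs.yaml` against your CoreWeave Cloud namespace.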
Virtual Servers can be deployed in all of CoreWeave's Data Center Regions, allowing for geographic diversity.
With GPU PCI pass-through, there's no GPU virtualization or shared resources.
Virtual Servers come with pre-built Linux distributions, Windows versions, or bring your own image! Use cloud-init at startup for even more customization and control.
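As a concrete illustration of startup customization, a standard cloud-init user-data file can install packages and run commands on first boot. The snippet below uses only well-documented cloud-init modules (`packages`, `runcmd`); how the user data is attached to a Virtual Server depends on CoreWeave's deployment interface.

```yaml
#cloud-config
# Install packages and run a command on first boot.
packages:
  - git
  - htop
runcmd:
  - [ sh, -c, 'echo "first boot complete" > /var/log/first-boot.log' ]
```

cloud-init applies `packages` during the config stage and executes `runcmd` once, at the end of first boot.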
Leverage the high performance of CoreWeave Cloud Storage for both the Virtual Server root disk and any shared file system volumes to connect to centralized asset storage.
Blazing fast, flexible networking
Up to 100Gbps internal and external networking speed per instance, for blazing fast data transfers.
Directly attach IP addresses to a Virtual Server network interface, or leverage Load Balancer IPs to control internal and external service access.
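Load Balancer IPs follow the standard Kubernetes `Service` pattern: a Service of `type: LoadBalancer` selecting the Virtual Server's pods receives an external IP. The annotation shown for choosing a public address pool is an assumption based on common MetalLB-style setups; verify the annotation CoreWeave expects in its networking documentation.

```yaml
# Sketch of exposing a service running on a Virtual Server.
# The address-pool annotation is an assumption and may differ on CoreWeave.
apiVersion: v1
kind: Service
metadata:
  name: example-vs-lb
  annotations:
    metallb.universe.tf/address-pool: public
spec:
  type: LoadBalancer
  selector:
    app: example-vs      # hypothetical label on the Virtual Server's pod
  ports:
    - port: 443
      targetPort: 443
```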
While not every use case calls for a Virtual Server, some workloads aren't possible without one!
CoreWeave Virtual Servers run under the same API control plane and use the same storage and networking as your Kubernetes workloads. This provides a single, powerful platform for both stateful and stateless resource management.
Common use cases for CoreWeave Virtual Servers include:
Leverage the performance of CoreWeave's bare metal Cloud, even when running containerized Virtual Machines
CoreWeave Virtual Servers provide all the isolation and control benefits that come with running a workload on a real server. Leveraging GPU PCI pass-through means no GPU virtualization or shared resources on Virtual Servers.
Virtual Servers can be deployed with virtual desktop environments, also known as virtual desktop infrastructures or VDIs, which provide developer workstations running either Linux or Windows. By using applications like Parsec (for Windows machines) and Teradici (for Linux), developers can log in to their workstations to access their work from anywhere!
Don't need a Virtual Server?
If you've determined you don't need a Virtual Server, but want to leverage the performance benefits of running containerized workloads on CoreWeave Cloud, check out our documentation on CoreWeave Kubernetes.