There are several costs associated with running AI, and one of the most fundamental is providing the GPU power needed for inference. To date, organizations that need to provide AI inference have had to run long-lived cloud instances or provision hardware on-premises. Today, Google Cloud is previewing a new approach, and it could reshape the landscape of AI application deployment. The Google…