I have a Cloud Run service which I would like to connect to my default VPC network through a Serverless VPC connector. The final goal with this is to allow my Cloud Run instances to access an on-prem server through a separately configured VPN, but my first issue is that I cannot even access a VM on the same network.
I have tried debugging this issue using this vpc-network-tester service, which confirms that I cannot ping my Compute Engine VM.
From my VM, I am able to successfully ping the Cloud Run instances (I set up some simple firewall logging to find the internal IPs of the instances, and the VM is able to reach them).
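(The logging was just switched on for an existing rule, roughly like this; the rule name below is only a placeholder:)

```
# Enable firewall rules logging on an existing rule so matching traffic
# shows up in Cloud Logging ("my-allow-rule" is a placeholder name)
gcloud compute firewall-rules update my-allow-rule \
  --enable-logging
```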
From everything I have read, this sounds like a firewall issue on the Compute Engine VM or on the default network, but the only additional firewall rules I have set up are ALLOW rules that let me inspect firewall traffic.
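For completeness, the kind of rule I would expect to need looks roughly like this (a sketch only; the rule name is made up):

```
# Allow ICMP from the Serverless VPC connector's range (10.1.0.0/28)
# into VMs on the default network
gcloud compute firewall-rules create allow-connector-icmp \
  --network=default \
  --direction=INGRESS \
  --action=ALLOW \
  --rules=icmp \
  --source-ranges=10.1.0.0/28
```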
Here is some information on the configuration, but please let me know if there is additional info I can provide.
Serverless VPC access
(I originally had my own subnet set up for this, but I removed it and let the connector manage its own IP range instead)
Name | Network | IP address range | Region | Instance type |
---|---|---|---|---|
serverless-vpc-connector | default | 10.1.0.0/28 | us-central1 | e2-micro |
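For reference, the connector matches what a create command along these lines would produce (a sketch reconstructed from the table above, not necessarily the exact command I ran):

```
# Create the connector on the default network with its own /28 range
gcloud compute networks vpc-access connectors create serverless-vpc-connector \
  --network=default \
  --region=us-central1 \
  --range=10.1.0.0/28 \
  --machine-type=e2-micro
```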
Cloud Run Configuration (for vpc-network-tester)
VPC Connector | Route Type |
---|---|
serverless-vpc-connector | Route only requests to private IPs through the VPC connector |
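The service is deployed with the connector attached roughly like this (sketch; the image is a placeholder):

```
# Attach the connector and route only private-IP traffic through it
gcloud run deploy vpc-network-tester \
  --image=IMAGE \
  --region=us-central1 \
  --vpc-connector=serverless-vpc-connector \
  --vpc-egress=private-ranges-only
```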
Ping Response
Source | Destination | Result |
---|---|---|
10.1.0.2 (vpc-network-tester) | 10.128.0.2 (internal IP of Compute Engine VM) | PING 10.128.0.2 (10.128.0.2): 56 data bytes |
10.128.0.2 (internal IP of Compute Engine VM) | 10.1.0.2 (vpc-network-tester) | PING 10.1.0.2 (10.1.0.2) 56(84) bytes of data. |
Connectivity Test
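(For anyone wanting to reproduce it, a connectivity test between the connector IP and the VM can be created from the CLI roughly like this; the test name is arbitrary and PROJECT_ID is a placeholder:)

```
# Network Intelligence Center connectivity test from the connector IP to the VM
gcloud network-management connectivity-tests create connector-to-vm \
  --source-ip-address=10.1.0.2 \
  --source-network=projects/PROJECT_ID/global/networks/default \
  --destination-ip-address=10.128.0.2 \
  --destination-network=projects/PROJECT_ID/global/networks/default \
  --protocol=ICMP
```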