Networking FAQ

Questions and answers about networking, port mapping, and connectivity on oneinfer.ai

How can I open custom ports?

When you deploy an instance, oneinfer.ai automatically assigns a public IP and proxy so that standard ports (SSH, HTTP, HTTPS, Jupyter, etc.) work.

If you need a custom port (e.g. 32001 → 32001), you can request a custom port-mapping using the “Identity Port Map” option. This maps the external port to the same internal port.

Make sure your firewall rules (if any) allow ingress on that port, and confirm that the provider allows that port through its network proxy.
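Once the mapping is in place, you can verify from outside that the port is actually reachable. A minimal sketch in Python (the IP and port in the commented example are placeholders for your instance's public IP and mapped port):

```python
import socket

def port_is_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Attempt a TCP connection; True means something is listening and reachable."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (placeholder values — use your instance's public IP and mapped port):
# print(port_is_open("203.0.113.10", 32001))
```

A `False` result can mean the service isn't running, the firewall is blocking ingress, or the provider's proxy doesn't pass that port — check all three.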

Can I bind external port X to internal port Y (port translation)?

Yes — oneinfer.ai supports port-mapping by specifying external and internal ports. For example, you can map 8080 externally to 80 internally.

This flexibility helps when running web services, custom HTTP servers, or any service listening on a nonstandard port inside the container.
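To see what external-to-internal translation does under the hood, here is a toy TCP relay in pure Python that listens on one port and forwards each connection to another. The port numbers are illustrative — on oneinfer.ai the platform's proxy performs this translation for you:

```python
import socket
import threading

def pipe(src: socket.socket, dst: socket.socket) -> None:
    """Copy bytes from src to dst until EOF, then close dst's write side."""
    try:
        while chunk := src.recv(4096):
            dst.sendall(chunk)
    except OSError:
        pass
    finally:
        try:
            dst.shutdown(socket.SHUT_WR)
        except OSError:
            pass

def forward(external_port: int, internal_port: int, host: str = "127.0.0.1") -> socket.socket:
    """Listen on external_port and relay every connection to internal_port."""
    listener = socket.socket()
    listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    listener.bind((host, external_port))
    listener.listen(5)

    def accept_loop():
        while True:
            try:
                client, _ = listener.accept()
            except OSError:
                return  # listener was closed
            backend = socket.create_connection((host, internal_port))
            # One thread per direction: client -> backend and backend -> client.
            threading.Thread(target=pipe, args=(client, backend), daemon=True).start()
            threading.Thread(target=pipe, args=(backend, client), daemon=True).start()

    threading.Thread(target=accept_loop, daemon=True).start()
    return listener
```

With a service on internal port 80, `forward(8080, 80)` would expose it on 8080 — conceptually the same mapping the oneinfer.ai proxy applies at its edge.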

Is network traffic encrypted / proxied by default?

All control-plane connections (login, console, metadata) between your browser and oneinfer.ai use HTTPS/TLS. Traffic to the instance itself (SSH, HTTP, WebSocket) is encrypted only if your application encrypts it — oneinfer.ai does not automatically wrap instance traffic in TLS.

For sensitive workloads, consider using SSH tunnels, HTTPS, or VPN inside your container.
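One quick sanity check before sending anything sensitive is to verify that a given port actually speaks TLS. A rough sketch (certificate verification is disabled here because rented instances often run self-signed certs — re-enable it for production use):

```python
import socket
import ssl

def speaks_tls(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if the service on (host, port) completes a TLS handshake."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False          # demo only: accept self-signed certs
    ctx.verify_mode = ssl.CERT_NONE     # demo only: skip chain validation
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with ctx.wrap_socket(sock, server_hostname=host):
                return True
    except (ssl.SSLError, OSError):
        return False

# Example (placeholder host): speaks_tls("203.0.113.10", 443)
```

If this returns `False` for a port you expected to be encrypted, front the service with HTTPS or tunnel it over SSH instead of exposing it in plaintext.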

Will I get a public IP if I rent a GPU instance?

Yes — most offers include a public IP, and the marketplace listing shows whether a given instance has one. If it doesn't, choose offers that explicitly advertise "public IP" or "port proxy" support.

Can I run multiple services on different ports on the same instance?

Yes. You can run multiple services (web server, Jupyter, custom API, etc.) simultaneously as long as they listen on **different ports** inside the container, and you set up proper port-mapping or use the proxy that oneinfer.ai provides.

Be sure to check port availability (some ports may be blocked by the provider) and open the ports correctly in your container configuration.
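As an illustration, here is a sketch that runs two independent HTTP services on different ports in one process. The service names and ports are arbitrary — in practice each would be your web server, Jupyter, custom API, and so on:

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

def make_handler(name: str):
    """Build a handler class whose responses identify this service."""
    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = name.encode()
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

        def log_message(self, *args):  # keep the demo quiet
            pass

    return Handler

def start_service(name: str, port: int = 0) -> HTTPServer:
    """Start an HTTP service on its own port (0 = pick a free one) in a daemon thread."""
    server = HTTPServer(("127.0.0.1", port), make_handler(name))
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

# Two services, two ports, one instance (example ports):
# api = start_service("api", 8000)
# web = start_service("web", 8080)
```

Each service then needs its own port mapping (or proxy route) so it is reachable from outside the instance.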

Does oneinfer.ai support IPv6?

oneinfer.ai currently assigns IPv4 addresses for public IPs; IPv6 support varies between providers. Check the offer details — if you need IPv6, prefer offers or providers that explicitly mention it.
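To find out whether a given instance has a usable IPv6 stack, a best-effort local probe looks like this. Note this only checks that the local stack works — actual IPv6 routing to the internet still depends on the provider:

```python
import socket

def ipv6_available() -> bool:
    """Best-effort check: can we create and bind an IPv6 socket locally?"""
    if not socket.has_ipv6:
        return False
    try:
        with socket.socket(socket.AF_INET6, socket.SOCK_STREAM) as s:
            s.bind(("::1", 0))  # loopback bind proves the stack is enabled
        return True
    except OSError:
        return False
```

Run it inside the container after deployment; a `True` here plus a provider that advertises IPv6 is what you want to see.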