Terms & Conditions

Effective Date: January 16, 2026

These Master Terms & Conditions ("Agreement") govern access to and use of the services provided by oneinfer.ai, a product operated by Lokai Private Limited, having its registered office at Bangalore South, Karnataka – 560061, India ("oneinfer.ai", "Company", "we", "us", "our").

By accessing or using the Services, or by executing an order form, statement of work, or similar commercial agreement referencing this Agreement, you ("Customer", "you") agree to be bound by this Agreement.

1. Definitions

"Services" means the oneinfer.ai platform, APIs, unified inference layer, dashboards, documentation, software, and the provisioning of access to underlying compute resources.

"Compute Resources" means the Graphics Processing Units (GPUs) and related hardware infrastructure provided by third-party data centers, decentralized nodes, or cloud providers ("Providers").

"Customer Data" means any data, content, prompts, inputs, outputs, or materials submitted, processed, or generated by Customer through the Services.

"Order Form" means any written or electronic agreement specifying commercial terms, pricing, usage limits, or scope of Services.

"Authorized Users" means individuals authorized by Customer to access the Services.

2. Scope of Services & Infrastructure

2.1 Unified Inference Layer
oneinfer.ai acts as an orchestration layer enabling access, routing, and management of AI inference workloads across heterogeneous Compute Resources.

2.2 Dependence on Third-Party Providers
Customer acknowledges that oneinfer.ai utilizes third-party Providers to supply the underlying Compute Resources. Consequently:

Availability: Access to specific GPU models (e.g., H100, A100) is subject to global supply chain availability and Provider capacity.

No Hardware Exclusivity: Unless explicitly purchased as a "Reserved Instance," Compute Resources are multi-tenant and not exclusive to Customer.

Volatility: oneinfer.ai does not guarantee against hardware failures, thermal throttling, or bit-flip errors inherent to physical silicon.

2.3 Maintenance & Interruptions
Scheduled Maintenance: Providers may require downtime for hardware maintenance. oneinfer.ai will endeavor to route traffic to alternative nodes but does not guarantee uninterrupted service during these windows.

Emergency Maintenance: Providers may perform emergency patching or hardware replacement without notice to protect network integrity.

Pre-emption (Spot Instances): If Customer opts for "Spot" or "Interruptible" pricing tiers, Customer acknowledges that workloads may be terminated immediately if the Provider reclaims capacity.

3. Grant of License

Subject to compliance with this Agreement and payment of applicable fees, oneinfer.ai grants Customer a limited, non-exclusive, non-transferable, non-sublicensable license to access and use the Services solely for Customer's internal business purposes during the applicable subscription term. All rights not expressly granted are reserved.

4. Prohibited Uses & Security Policy

Customer shall not, directly or indirectly, use the Services for any of the following:

4.1 Sensitive & Illegal Content
CSAM: Generating or processing Child Sexual Abuse Material (CSAM), exploitation of minors, or non-consensual intimate imagery (NCII).

Nudity & Pornography: Unlawful pornography or sexually explicit content that violates local standards or applicable laws in the jurisdiction where the GPU node is physically located.

Violence & Terrorism: Content that promotes, facilitates, or encourages acts of terrorism, violent extremism, self-harm, suicide, or hate speech against protected groups.

Data Scraping: Using the Services to scrape, harvest, or extract data from third-party websites in violation of their Terms of Service, or harvesting PII (Personally Identifiable Information) from public datasets.

4.2 Deceptive AI & Social Engineering
Deepfakes: Creating synthetic audio or video that deceptively mimics a real person's likeness or voice without their explicit written consent.

Social Engineering: Generating content intended to trick third parties into revealing sensitive information (passwords, financial data) or believing they are interacting with a trusted entity (e.g., bank impersonation).

Political Manipulation: Content intended to interfere with elections, spread political disinformation, or impersonate government officials.

4.3 Network Abuse & Hacking
Unauthorized Scanning: Probing, scanning, or testing the vulnerability of any system, network, or subnet (including oneinfer.ai's internal network or other customers' instances) without express written authorization.

Packet Sniffing: Monitoring data or traffic on any network or system without permission (e.g., using tcpdump or wireshark to capture other tenants' traffic).

Spoofing: Forging TCP/IP packet headers, email headers, or any part of a message header describing its origin or route in order to mislead the recipient as to the origin of the traffic.

Denial of Service (DoS): Flooding a target with communications requests so the target either cannot respond to legitimate traffic or responds so slowly that it becomes ineffective.

4.4 High-Risk Regulated Industries
Weapons: Instructions on how to manufacture firearms, explosives, or biological weapons.

Unlicensed Advice: Generating medical diagnoses or legal judgements without a "human-in-the-loop" disclaimer, where such advice could lead to physical harm or financial loss.

5. Customer Responsibilities & Technical Requirements

5.1 Identity Verification & Account Integrity
True Identity: Customer represents that all information provided during registration (including name, email, and corporate affiliation) is accurate. Using false identities or other deceptive means to circumvent bans, usage limits, or free-tier restrictions is strictly prohibited.

No Account Sharing: Customer shall not share access credentials with third parties or allow "renting out" of their account access.

5.2 Immutable Infrastructure
Host Modification: Customer shall not attempt to access, modify, or configure the underlying operating system, kernel, or hardware drivers of the Compute Resources (e.g., via root SSH).

Stateless Compute: Customer acknowledges that Compute Resources are ephemeral and stateless. oneinfer.ai reserves the right to reboot, re-image, or migrate Compute Resources at any time.

Container Requirement: All software dependencies must be defined within the Customer's container image. oneinfer.ai is not responsible for missing libraries resulting from manual (ad-hoc) changes made via SSH.

Ephemeral Storage: Data written to the local boot disk (e.g., /tmp) will be permanently lost upon instance termination. Customer must use designated Persistent Volumes.

5.3 Observability & Metrics
Permitted Monitoring: Customer may collect application-level metrics solely from within the authorized Container environment using user-space tools.

Prohibited Agents: Customer shall not install system-level monitoring agents or kernel modules (e.g., node_exporter, eBPF) on the host OS.

Resource Consumption: oneinfer.ai reserves the right to throttle processes that destabilize the node via excessive telemetry requests.

5.4 Content Moderation
Right to Screen: oneinfer.ai reserves the right (but not the obligation) to use automated tools to detect prohibited content (e.g., CSAM hashes).

Mandatory Reporting: Customer acknowledges that oneinfer.ai is legally obligated to report certain illegal activities to law enforcement authorities.

5.5 Credential Security & API Key Safety
Duty of Confidentiality: Customer is solely responsible for maintaining the security of all access credentials, API keys, and SSH keys issued by oneinfer.ai.

No Public Exposure: Customer shall not embed API keys in open-source code, public repositories (e.g., GitHub, GitLab), client-side applications, or public forums.

Liability for Leaks: Customer is fully responsible for all fees, usage, and damages incurred under their account, including usage resulting from compromised credentials, until Customer explicitly revokes the compromised credentials via the Platform dashboard or notifies oneinfer.ai support.

5.6 Firewall & Network Configuration
Customer Responsibility: Customer is solely responsible for configuring and maintaining its network security rules, including the IP allowlists ("whitelists"), firewall rules, and CIDR block restrictions that Customer supplies to oneinfer.ai.

Default Deny: Customer acknowledges that enabling the Firewall feature may result in a "Default Deny" posture, blocking all traffic not explicitly allowed. oneinfer.ai is not liable for service interruptions caused by Customer's misconfiguration of these rules (e.g., accidentally blocking their own API traffic).

No Liability for Exposure: If Customer configures the Firewall to allow public access (e.g., whitelisting 0.0.0.0/0 or broad subnets), Customer assumes full responsibility for any unauthorized access, DDoS attacks, or data exfiltration resulting from such exposure.

6. Fees, Billing, and Taxes

Usage-Based Billing: Fees are calculated based on compute usage (e.g., GPU-hours) as measured by oneinfer.ai.

Source of Truth: In the event of a discrepancy, oneinfer.ai's logs shall be the sole source of truth for billing.

Payment Terms: Fees are charged as specified in the Order Form. Payments are non-refundable unless expressly stated otherwise.

Suspension: Failure to pay may result in immediate suspension of Services.

7. Intellectual Property Rights

7.1 oneinfer.ai IP
oneinfer.ai and its licensors retain all right, title, and interest in and to the Services, platform, APIs, software, and documentation.

7.2 Customer Data
Customer retains ownership of Customer Data. Customer grants oneinfer.ai a limited, revocable license to process Customer Data solely to provide, secure, maintain, and improve the Services.

8. Data Protection & Shared Infrastructure

8.1 Security Measures
oneinfer.ai will implement reasonable administrative, technical, and organizational measures to protect Customer Data.

8.2 Processing on Provider Hardware
Customer acknowledges that workloads are processed on hardware owned by third-party Providers. While oneinfer.ai mandates security standards:

Customer accepts the inherent risk of processing sensitive data on third-party infrastructure.

oneinfer.ai shall not be liable for unauthorized physical access to hardware by Provider personnel, provided oneinfer.ai has not been negligent in its selection of Providers or in its encryption protocols.

9. Shared Responsibility Model

The parties acknowledge the following division of responsibilities:

oneinfer.ai / Provider: Physical Hardware, Host OS, Drivers, Orchestration Layer, Physical Security.

Customer: Container Images, Application Code, Data Persistence (Checkpoints), Application-Level Security, Credential Management, Network/Firewall Configuration, and Content Moderation of End-User Outputs.

10. Service Levels & Business Continuity (DORA Framework)

10.1 Service Level Targets
The availability, reliability, and performance targets, including any applicable Recovery Time Objectives (RTO) and Recovery Point Objectives (RPO), are set forth in the applicable Service Level Agreement (SLA). Unless explicitly stated in an Order Form, Services are provided on a "reasonable efforts" basis.

10.2 Business Continuity Plan (BCP)
oneinfer.ai maintains a Business Continuity Plan. Upon Customer's reasonable request (max once per year), oneinfer.ai shall provide a summary of BCP testing results to assist Customer with regulatory compliance (e.g., DORA Article 30).

10.3 Incident Reporting
oneinfer.ai shall notify Customer without undue delay after becoming aware of any Major ICT-Related Incident affecting the integrity or availability of Customer's workloads.

11. Third-Party Services

The Services rely on third-party infrastructure. oneinfer.ai is not responsible for outages, hardware degradation, or changes imposed by third-party Providers that are beyond its reasonable control.

12. Warranty Disclaimer

THE SERVICES ARE PROVIDED "AS IS" AND "AS AVAILABLE."

TO THE MAXIMUM EXTENT PERMITTED BY LAW, oneinfer.ai DISCLAIMS ALL WARRANTIES, INCLUDING:

MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT.

WARRANTIES REGARDING HARDWARE UPTIME, LATENCY, OR ERROR-FREE OPERATION OF PROVIDER GPUS.

13. Limitation of Liability

To the maximum extent permitted by law:

oneinfer.ai shall not be liable for indirect, incidental, special, consequential, or punitive damages, including loss of profits, data, or compute time.

oneinfer.ai's total aggregate liability shall not exceed the fees paid by Customer to oneinfer.ai in the six (6) months preceding the event giving rise to the claim.

14. Indemnification

Customer agrees to indemnify and hold harmless oneinfer.ai and its underlying Providers against any claims arising from: (a) Customer Data (including IP violations, Deepfakes, or Illegal Content); or (b) Customer's misuse of the Services, hacking attempts, credential leaks, firewall misconfiguration, or violation of the Prohibited Uses Policy (Clause 4).

15. Term and Termination

This Agreement remains effective until terminated.

Customer may terminate by discontinuing use.

oneinfer.ai may suspend or terminate access for material breach (including AUP violations), non-payment, or legal compliance requirements.

16. Governing Law and Jurisdiction

This Agreement shall be governed by and construed in accordance with the laws of India. The courts of Bangalore, Karnataka shall have exclusive jurisdiction.

17. Force Majeure

Neither party shall be liable for failure to perform resulting from conditions beyond its reasonable control, including fiber optic cable cuts, power grid failures, global hardware supply chain shortages, fires, floods, pandemics, or government acts.

18. Amendments

oneinfer.ai may update this Agreement from time to time. Continued use of the Services after updates constitutes acceptance of the revised terms.

19. Execution & Acceptance

This Agreement is deemed accepted and legally binding upon execution by authorized representatives of both parties or upon Customer's access to or use of the Services, whichever occurs first.

© oneinfer.ai. All rights reserved.