Infera Nodes

Harness the Infera network for scalable, secure, and cost-effective inference.

Network Architecture

Our node network distributes work automatically; you just make a regular API request.

A. The API Gateway receives inference and read requests from API users.

B. The API Gateway balances the requests and distributes them to Infera nodes.

C. Nodes perform inference and verify the responses among themselves.

D. The results are passed to the results database for storage and retrieval.

Inference Network Diagram
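The gateway flow above (receive, balance, verify, store) can be sketched as a minimal in-memory simulation. This is an illustrative model only, not the Infera API: the function names, the majority-vote verification rule, and the dictionary standing in for the results database are all assumptions made for the sketch.

```python
import random
from collections import Counter

# Hypothetical stand-in for the results database (step D).
RESULTS_DB = {}

def node_infer(node_id, prompt):
    # Step C: each node runs inference. A deterministic stand-in model
    # is used here so the sketch is runnable without a real engine.
    return f"completion-for:{prompt}"

def gateway_handle(request_id, prompt, nodes):
    # Step A: the API Gateway receives the request.
    # Step B: balance the request across a subset of nodes.
    chosen = random.sample(nodes, k=3)
    outputs = [node_infer(n, prompt) for n in chosen]
    # Step C: nodes cross-verify; here, majority agreement among the
    # sampled nodes counts as verification (an assumed rule).
    result, votes = Counter(outputs).most_common(1)[0]
    verified = votes >= 2
    # Step D: store the result for later retrieval.
    RESULTS_DB[request_id] = {"result": result, "verified": verified}
    return RESULTS_DB[request_id]
```

A caller would simply invoke `gateway_handle("req-1", "hello", list(range(5)))` and later read the stored record back from the results store, mirroring the "storage and retrieval" role described above.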
How nodes work

Nodes accept and complete inference requests, which are then verified by other nodes.

A. Nodes listen for inference requests broadcast from the load balancer.

B. When a job is received, the input is passed to the node's inference engine.

C. Post-inference, the results are verified against similar outputs from reference nodes.

D. The results are then routed back to the load balancer alongside the verification results.

Node Diagram
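The node lifecycle above (listen, infer, verify, return) can be sketched as a small worker loop. Again this is a sketch under assumptions: the queue-based job feed, the `reference_outputs` mapping, and exact-match verification are illustrative choices, not the real node protocol.

```python
import queue

def run_node(jobs, reference_outputs, results_out, infer):
    """Drain the job queue, running one inference per job.

    jobs              -- queue of {"id", "input"} dicts (stand-in for the
                         load balancer broadcast, step A)
    reference_outputs -- job id -> expected output from reference nodes
                         (stand-in for cross-node verification, step C)
    results_out       -- queue the results are routed back on (step D)
    infer             -- the node's inference engine (step B)
    """
    while True:
        try:
            job = jobs.get_nowait()  # Step A: pick up a broadcast job.
        except queue.Empty:
            break
        output = infer(job["input"])  # Step B: run the inference engine.
        # Step C: compare against a reference node's output; exact match
        # is an assumed verification rule for this sketch.
        verified = output == reference_outputs.get(job["id"])
        # Step D: route the result back alongside its verification flag.
        results_out.put({"id": job["id"], "output": output,
                         "verified": verified})
```

In this design the verification flag travels with the result, so the load balancer can decide what to do with unverified outputs without re-querying the node.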