How to use an onchain LLM
In this code example, we’ll demonstrate how to interact with an LLM deployed on multiple blockchains in a few easy steps using TypeScript.
Step 0
Obtain the chain id and model id from the Onchain Models page. (You will need these two values to run the code in Step 3 below.)
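For illustration, these two values are typically kept as configuration constants in the script. The constant names and values below are placeholders for this sketch (Base’s chain id happens to be 8453, but use whatever the Onchain Models page lists for your model):

```typescript
// Values obtained from the Onchain Models page in Step 0.
// The numbers below are placeholders, not real deployment IDs.
const CHAIN_ID = 8453;                                   // e.g. Base mainnet; use the chain id shown for your model
const MODEL_ID = "<MODEL_ID_FROM_ONCHAIN_MODELS_PAGE>";  // the model id shown on the same page
```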
Step 1
Call the infer function of the HybridModel contract with an inference prompt.
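The following is a minimal sketch of this call using ethers.js. The RPC URL, contract address, ABI fragment, and infer signature here are assumptions made for the example; the actual values and the exact function signature should be taken from the example repository linked in Step 3.

```typescript
import { ethers } from "ethers";

// Placeholders for illustration only; take the real values from the
// Onchain Models page and the example repository.
const RPC_URL = "https://mainnet.base.org";
const HYBRID_MODEL_ADDRESS = "<HYBRID_MODEL_CONTRACT_ADDRESS>";
const HYBRID_MODEL_ABI = [
  // Hypothetical signature -- verify it against the actual HybridModel ABI in the repository.
  "function infer(bytes prompt) payable returns (uint256 inferId)",
];

async function sendPrompt(prompt: string): Promise<string> {
  const provider = new ethers.JsonRpcProvider(RPC_URL);
  const wallet = new ethers.Wallet(process.env.PRIVATE_KEY!, provider);
  const hybridModel = new ethers.Contract(HYBRID_MODEL_ADDRESS, HYBRID_MODEL_ABI, wallet);

  // Encode the prompt as bytes and submit the inference request on-chain.
  const tx = await hybridModel.infer(ethers.toUtf8Bytes(prompt));
  const receipt = await tx.wait();
  console.log("Prompt tx:", receipt!.hash);

  // The inferId is typically emitted in an event on this transaction; parse it from
  // receipt.logs (the exact event name depends on the deployed contract).
  return receipt!.hash;
}
```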
Step 2
Periodically check with the PromptScheduler contract to retrieve the response returned by the LLM’s miners, using the inferId.
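As a hedged sketch of this polling loop with ethers.js: the PromptScheduler address, getter name, and return type below are assumptions for illustration; the real contract interface is defined in the example repository.

```typescript
import { ethers } from "ethers";

// Placeholder address and ABI fragment -- use the real PromptScheduler details from the repository.
const PROMPT_SCHEDULER_ADDRESS = "<PROMPT_SCHEDULER_CONTRACT_ADDRESS>";
const PROMPT_SCHEDULER_ABI = [
  // Hypothetical getter -- verify the actual function name and return type against the PromptScheduler ABI.
  "function getInferenceResult(uint256 inferId) view returns (bytes output)",
];

async function waitForResponse(provider: ethers.Provider, inferId: bigint): Promise<string> {
  const scheduler = new ethers.Contract(PROMPT_SCHEDULER_ADDRESS, PROMPT_SCHEDULER_ABI, provider);

  // Poll until a miner has posted the result on-chain; empty bytes means "not ready yet".
  while (true) {
    const output: string = await scheduler.getInferenceResult(inferId);
    if (output && output !== "0x") {
      return ethers.toUtf8String(output); // decode the miner's response bytes into text
    }
    await new Promise((resolve) => setTimeout(resolve, 10_000)); // wait 10 s between checks
  }
}
```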
Step 3
The complete example code can be found at: https://github.com/eternalai-org/eternal-ai/blob/master/examples/UniverseDagents/scripts/sendUniverseAgentRequest.ts
You can run the code with the following command:
Note: replace <YOUR_KEY> with your actual wallet private key; the wallet should hold some ETH on the Base network to pay the network fee.
Example transaction hashes for the prompt request and the miner’s response can be found here:
Prompt tx: https://basescan.org/tx/0x641c26eff85f9486dace2d4ac0558b3c8da576b9e3e79773bb94a817fb8db45c
Response tx: https://basescan.org/tx/0x133b94f00d31908c4b88b8d1b2602a1241bac7a7ee744c04a248ea3e6f47fc75