Chain of thought

A demonstration of how to chat interactively with your dagent.

1. Install Miniconda:

- Download and install Miniconda by following the official Miniconda installation instructions.

2. Clone the repository:

git clone https://github.com/eternalai-org/Eternals 
cd Eternals
git checkout add/react-qa

3. Activate Conda Environment:

- Create and activate a Conda environment for Python 3.10.0:

conda create -n eternalai_dagents_chat python=3.10.0
conda activate eternalai_dagents_chat

4. Install Dependencies:

- Install the required Python packages using pip.

pip install -r requirements.txt

- Compile and build the eternal_dagents package:

pip install --force-reinstall .

5. Create a .env File:

- In the root of your repository, create a .env file to store your environment variables.

cp .env.example .env

6. Obtain Inference API Key:

- Go to EternalAI and connect your account to retrieve your inference API key.

7. Configure the .env File:

- Open your .env file and insert your inference API key in the following format:

# Environment
IS_SANDBOX=0

# for contract based llm
ETERNAL_BACKEND_API=https://api.eternalai.org
ETERNAL_BACKEND_API_APIKEY=your_inference_api_key
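
- For reference, the following is an unofficial sketch of how an application can read these variables (assuming the python-dotenv package is available; the actual logic inside chat-lite.py may differ):

import os
from dotenv import load_dotenv  # assumption: python-dotenv is installed

# Load the variables defined in the .env file at the repository root
load_dotenv()

ETERNAL_BACKEND_API = os.getenv("ETERNAL_BACKEND_API", "https://api.eternalai.org")
ETERNAL_BACKEND_API_APIKEY = os.getenv("ETERNAL_BACKEND_API_APIKEY")

if not ETERNAL_BACKEND_API_APIKEY:
    raise RuntimeError("ETERNAL_BACKEND_API_APIKEY is not set; check your .env file")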

8. Run the Application:

- Execute the script to start the application:

python3 toolkit/chat-lite.py 

- Sample output:
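
- Conceptually, chat-lite.py runs an interactive loop that sends each of your messages to the inference backend configured in .env and prints the dagent's reply. The sketch below is an unofficial illustration only: it assumes an OpenAI-compatible chat completions endpoint at /v1/chat/completions and uses a placeholder model id, both of which are assumptions rather than documented behavior of the script.

import os
import requests
from dotenv import load_dotenv  # assumption: python-dotenv is installed

load_dotenv()
BASE_URL = os.getenv("ETERNAL_BACKEND_API", "https://api.eternalai.org")
API_KEY = os.getenv("ETERNAL_BACKEND_API_APIKEY")

history = []  # keep prior turns so the dagent has conversational context
while True:
    user_input = input("You: ")
    if user_input.strip().lower() in {"exit", "quit"}:
        break
    history.append({"role": "user", "content": user_input})
    # Hypothetical OpenAI-compatible request; the endpoint path and model id are assumptions
    response = requests.post(
        f"{BASE_URL}/v1/chat/completions",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"model": "your_model_id", "messages": history},
        timeout=120,
    )
    response.raise_for_status()
    reply = response.json()["choices"][0]["message"]["content"]
    history.append({"role": "assistant", "content": reply})
    print(f"Dagent: {reply}")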
