AI Integration

Kenesis integrates advanced artificial intelligence (AI) through its LLM Open Module to enhance the learning experience within its decentralized, Web3-based ecosystem. By leveraging a custom-built AI framework and decentralized infrastructure, Kenesis ensures that its AI-driven learning assistants are secure, scalable, and aligned with the platform’s ethos of transparency, creator empowerment, and trustlessness.

The AI integration includes the following components:

Custom-Built AI Framework for LLM Open Module

The LLM Open Module is powered by a proprietary AI framework designed to enable creators to train course-specific learning assistants using tailored prompts. Creators provide prompts that align with their content, such as courses, eBooks, or research papers, allowing the Large Language Model (LLM) to deliver contextually relevant responses, clarifications, and guidance. This framework supports dynamic interaction, enabling the AI to answer learner queries, explain complex concepts, and enhance engagement in real time, all within the course interface.

The following Python example demonstrates training an AI bot with course-specific prompts using the Kenesis AI SDK:

from kenesis_ai_sdk import KenesisAIClient
# Initialize Kenesis AI Client
client = KenesisAIClient(api_key="your-api-key", endpoint="https://api.kenesis.io/v1/ai")

# Train AI bot with prompt
course_id = "course_12345"
prompt = {
    "context": "Blockchain Basics",
    "question": "What is a smart contract?",
    "response": "A self-executing contract with terms written in code."
}
response = client.train_course_bot(course_id, prompt)
print(f"Bot trained: {response['botId']}")

Explanation: This snippet shows how a creator trains an AI bot using the AI SDK, submitting a prompt for a blockchain course. The bot is linked to the course ID, enabling context-aware responses within the course interface, compatible with NFT-based access control.
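Because bot access is NFT-gated, the backend must verify token ownership before serving responses. The SDK handles this internally; the following sketch illustrates the underlying check with a hypothetical in-memory ledger (the `owner_of` mapping, `verify_access`, and `query_bot` helpers are illustrative, not part of the Kenesis AI SDK):

```python
# Hypothetical sketch of NFT-gated access control; the ledger and
# helper names below are illustrative, not part of the Kenesis AI SDK.

# Stand-in for on-chain ownership: (contract, tokenId) -> owner wallet
owner_of = {
    ("0xCourseNFT", "123"): "0xLearnerWallet",
}

def verify_access(nft_contract: str, token_id: str, wallet: str) -> bool:
    """Return True if `wallet` owns `token_id` of the course NFT contract."""
    return owner_of.get((nft_contract, token_id)) == wallet

def query_bot(bot_id: str, query: str, token_id: str, wallet: str) -> str:
    # Reject queries from wallets that do not hold the course NFT
    if not verify_access("0xCourseNFT", token_id, wallet):
        raise PermissionError("Wallet does not hold the required course NFT")
    # In production, the query would be routed to the trained bot here
    return f"[{bot_id}] answer to: {query}"

print(query_bot("bot_67890", "What is a smart contract?", "123", "0xLearnerWallet"))
```

In a live deployment, the ownership lookup would be an on-chain call against the course NFT contract rather than an in-memory mapping.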

Decentralized AI Hosting

To maintain consistency with the project’s decentralized ethos, the AI infrastructure is hosted on a distributed network, eliminating reliance on centralized servers. This decentralized hosting ensures that the LLM Open Module operates securely, with data and processing distributed across multiple nodes to enhance availability, resilience, and privacy. By integrating with Kenesis’s proprietary decentralized storage solution, the AI framework aligns with the platform’s commitment to a trustless and scam-resistant ecosystem, safeguarding both creator inputs and learner interactions.

The following JavaScript example demonstrates querying a decentralized AI model hosted on Kenesis’s distributed network:

import axios from 'axios';
async function queryDecentralizedBot(botId, query, tokenId, walletAddress) {
  try {
    const response = await axios.post(
      'https://ai.kenesis.io/v1/query',
      { botId, query, tokenId, walletAddress },
      { headers: { Authorization: 'Bearer <JWT_TOKEN>' } }
    );
    return response.data.answer; // e.g., "A smart contract is..."
  } catch (error) {
    console.error('Error querying AI:', error.message);
    throw error;
  }
}
// Query bot for learner
queryDecentralizedBot('bot_67890', 'What is a smart contract?', '123', '0x...');

Explanation: This snippet shows how a learner queries a decentralized AI bot, hosted on Kenesis’s distributed network, using a bot ID and an NFT token ID for access verification. The API integrates with the decentralized storage layer to ensure secure, NFT-gated responses, aligning with the access control enforced by Kenesis’s NFT contract.
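The resilience of distributed hosting comes from a client’s ability to route a query to another node when one fails. The node URLs and the `query_with_failover` helper below are hypothetical, intended only to sketch the failover pattern such a client might use:

```python
# Hypothetical failover sketch; node URLs and helper names are illustrative only.

class NodeUnavailable(Exception):
    """Raised when a single AI node cannot serve the request."""

def query_node(node_url: str, query: str, healthy_nodes: set) -> str:
    """Stand-in for an HTTP call to one AI node; fails if the node is down."""
    if node_url not in healthy_nodes:
        raise NodeUnavailable(node_url)
    return f"answer from {node_url}"

def query_with_failover(nodes: list, query: str, healthy_nodes: set) -> str:
    """Try each node in turn; return the first successful answer."""
    for url in nodes:
        try:
            return query_node(url, query, healthy_nodes)
        except NodeUnavailable:
            continue  # this node is down, fall through to the next one
    raise RuntimeError("All AI nodes are unavailable")

nodes = ["https://node1.example", "https://node2.example"]
# node1 is simulated as down, so the query is served by node2
print(query_with_failover(nodes, "What is a smart contract?", {"https://node2.example"}))
```

A real client would replace `query_node` with an HTTP request and would typically add timeouts and node-health caching, but the routing logic is the same.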

This AI integration empowers creators to deliver personalized and interactive learning experiences while leveraging the security and transparency of decentralized infrastructure. By embedding AI-driven learning assistants within its blockchain-based platform, Kenesis reinforces its mission to revolutionize education and research through innovative, creator-centric, and user-focused technology.