The AI Rug Checker 🤖 is a platform that uses AI and blockchain data analysis to assess cryptocurrency risks, especially meme coins 🐕💰, helping investors detect potential "rug pulls." With machine learning and chatbot interaction, it delivers real-time insights, promoting transparency and security in the crypto space.

๐Ÿ• Rug Watch Dog ๐Ÿค–๐Ÿ”—

Welcome to the Rug Watch Dog, an advanced AI-driven platform that helps investors analyze cryptocurrency tokens, especially meme coins 🐕💰, to detect potential "rug pulls" 🛑. This project combines cutting-edge machine learning 📊, blockchain data analysis 🔗, and chatbot integration 🤝 to enhance security 🔒 in the crypto ecosystem. Check out the live demo: RugWatchDog.

🌟 Features

  • AI Risk Analysis: Automatically analyze meme coins for risks like insider holding %, sniper wallet activity, and volume anomalies.
  • Blockchain Data Fetching: Integrates with APIs (Etherscan, DexScreener) to fetch real-time token and transaction data.
  • EdgeDB Database: Stores and retrieves token data and model predictions.
  • Eliza Chatbot Integration: Interact with a conversational AI assistant on Discord, Telegram, and Twitter for real-time insights.
  • FUD Alerts: Automatically generate social media alerts for high-risk tokens to keep the community informed.
  • Customizable AI Models: Train and adapt the AI to detect emerging fraud patterns in the crypto ecosystem.

🔄 Application Flow

User Request
    │
    ▼
API Layer (src/api/)
    │
    ▼
Data Collection Layer
    │
    ├─► Etherscan API
    │   (src/data-harvesting/fetcher.ts)
    │
    └─► DexScreener API
        (src/data-harvesting/fetcher.ts)
    │
    ▼
Data Processing
    │
    ├─► Token Metrics
    │   (src/data-processing/metrics.ts)
    │
    └─► Data Storage
        (src/data-processing/storage.ts)
    │
    ▼
ML Analysis
    │
    ├─► Model Prediction
    │   (src/training/predictor.ts)
    │
    └─► Risk Evaluation
        (src/training/evaluator.ts)
    │
    ▼
Response/Alerts
    │
    ├─► API Response
    │   (src/api/routes/)
    │
    └─► Social Integrations
        (src/integrations/)

Process Explanation:

  1. Input: User submits a token address for analysis
  2. Data Collection: System fetches data from multiple sources
  3. Processing: Raw data is transformed into risk metrics
  4. Analysis: AI model evaluates the risk factors
  5. Output: Generates alerts or stores results for training

For more details on each step, see the documentation below.
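The five steps above can be sketched as a small orchestration function. This is an illustrative sketch, not the project's actual code: the helper names (`fetchTokenData`, `computeMetrics`, `predictRisk`), the interfaces, and the stubbed values and weights are all hypothetical stand-ins for the modules named in the diagram.

```typescript
// Hypothetical sketch of the analysis pipeline described above.
// Stand-ins for src/data-harvesting, src/data-processing, src/training.

interface RawTokenData { address: string; volume24h: number; holders: number; liquidity: number; }
interface RiskMetrics { volumeAnomaly: number; holderConcentration: number; liquidityScore: number; }
interface RiskReport { address: string; riskScore: number; isHighRisk: boolean; }

// 2. Data Collection: in the real project this calls Etherscan/DexScreener.
async function fetchTokenData(address: string): Promise<RawTokenData> {
  return { address, volume24h: 1_000_000, holders: 250, liquidity: 50_000 }; // stubbed
}

// 3. Processing: turn raw data into normalized risk metrics in [0, 1].
function computeMetrics(raw: RawTokenData): RiskMetrics {
  return {
    volumeAnomaly: Math.min(1, raw.volume24h / (raw.liquidity * 100)),
    holderConcentration: Math.min(1, 1000 / Math.max(raw.holders, 1) / 10),
    liquidityScore: raw.liquidity < 100_000 ? 1 : 0,
  };
}

// 4. Analysis: the real project uses a trained model; a weighted average stands in here.
function predictRisk(m: RiskMetrics): number {
  return 0.4 * m.volumeAnomaly + 0.35 * m.holderConcentration + 0.25 * m.liquidityScore;
}

// 5. Output: a report the API layer can return or turn into an alert.
async function analyzeToken(address: string): Promise<RiskReport> {
  const raw = await fetchTokenData(address);
  const metrics = computeMetrics(raw);
  const riskScore = predictRisk(metrics);
  return { address, riskScore, isHighRisk: riskScore > 0.7 };
}
```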

🔧 Technical Architecture

1. API Layer (src/api/)

  • routes/: API endpoint handlers
    • analyze.ts: Token analysis endpoints
    • metrics.ts: Metrics retrieval
    • tokens.ts: Token management
  • middleware/: Request processing
    • auth.ts: Authentication handling

2. Data Collection (src/data-harvesting/)

  • fetcher.ts: External API integrations
  • collector.ts: Data collection orchestration
  • chainMonitor.ts: Blockchain scanning
  • tokenScanner.ts: Token-specific scanning

3. Data Processing (src/data-processing/)

  • metrics.ts: Token metrics calculation
  • parser.ts: Raw data parsing
  • storage.ts: Data persistence layer

4. Machine Learning (src/training/)

  • modelPredictor.ts: Risk prediction logic
  • modelEvaluator.ts: Model evaluation
  • modelTrainer.ts: Model training pipeline

5. Database Layer (src/db/)

  • models/: Database schemas
  • migrations/: Database migrations
  • connection.ts: Database configuration

6. Monitoring & Scripts (src/scripts/)

  • collect-data.ts: Training data collection
  • clean-db.ts: Database maintenance
  • train.ts: Model training execution

7. Types & Utils (src/types/, src/utils/)

  • api.ts: API interfaces
  • data.ts: Data structure types
  • utils.ts: Helper functions

8. Model Storage (/models/)

  • datasets/: Training datasets
  • trained/: Trained model files

9. Integration Layer

  • src/chat/index.ts: Chat interface implementation
  • src/clients/index.ts: Social media client integrations
  • src/cache/index.ts: Performance optimization

🔧 Database Schema

The database schema is defined in the dbschema directory, in the default.esdl file. After editing the schema, generate a migration for it with the EdgeDB CLI:

edgedb migration create

🔧 Database Migrations

Database migrations are managed in the migrations directory. You can create migrations using the EdgeDB CLI with the following command:

edgedb migration create

Using EdgeDB Shell

  1. Open EdgeDB Shell: Run the following command in your terminal to open the EdgeDB interactive shell:

    edgedb
  2. List All Object Types: Use the following EdgeQL command to list all object types (tables) and their properties:

    SELECT schema::ObjectType {
        name,
        properties: {
            name,
            target: {
                name
            }
        }
    } FILTER .name LIKE 'default::%';

Step-by-Step Guide for Migrations

  1. Create a New Migration:

    • Ensure your .esdl files reflect the current desired schema state.
    • Run the following command to create a new migration:
      edgedb migration create
  2. Apply the Migration:

    • Run the following command to apply the migration:
      edgedb migrate
  3. Create a new migration, apply it to the cloud instance, and generate the query builder:

     edgedb migration create
     edgedb migrate
     edgedb migrate -I mollybeach/rug-watch-dog-db
     pnpm generate edgeql-js
  4. Connect to the EdgeDB instance:

     edgedb -I mollybeach/rug-watch-dog-db
  5. Write a SELECT query to check the data:
    SELECT Token {
        address,
        name,
        symbol,
        metrics: {
            tokenAddress,
            volumeAnomaly,
            holderConcentration,
            liquidityScore,
            priceVolatility,
            sellPressure,
            marketCapRisk,
            bundlerActivity,
            accumulationRate,
            stealthAccumulation,
            suspiciousPattern,
            isRugPull,
            timestamp,
            holders,
            totalSupply,
            currentPrice,
            isHoneyPot
        },
        price: {
            tokenAddress,
            price,
            liquidity,
            volume24h,
            marketCap,
            timestamp
        },
        createdAt,
        updatedAt
    };


Checking Data in EdgeDB

To check the contents of your EdgeDB database, you can use the EdgeDB shell to run a SELECT query. Here's how:

  • Select Data: Execute a SELECT query to retrieve data from the TokenMetrics table. For example:

SELECT TokenMetrics; SELECT TokenPrices;


Login to EdgeDB

edgedb cloud login

Connect to Your EdgeDB Instance

Use the edgedb command to connect to your EdgeDB instance. You will need the connection details such as host, port, username, and database name. Here's an example command:

edgedb -H your-edgedb-host -P your-port -u your-username -d your-database

or

edgedb -I your-instance-name

Generate the Query builder

pnpm generate edgeql-js

Migrate to the cloud instance

edgedb migrate -I mollybeach/rug-watch-dog-db

To close the EdgeDB shell, press CTRL + D.


๐Ÿ› ๏ธ Setup

1. Clone the Repository

git clone https://github.com/mollybeach/rug-watch-dog.git
cd rug-watch-dog

2. Install Dependencies

pnpm install

3. Set Up Environment Variables

Create a .env file in the root directory:

# API Keys
ETHERSCAN_API_KEY=your_etherscan_key_here
BSCSCAN_API_KEY=your_bscscan_api_key_here
POLYGONSCAN_API_KEY=your_polygonscan_api_key_here
ALCHEMY_API_KEY=your_alchemy_api_key_here

# RPC Endpoints
ETHEREUM_RPC=https://eth-mainnet.g.alchemy.com/v2/your_alchemy_api_key
BSC_RPC=https://bsc-dataseed1.binance.org
POLYGON_RPC=https://polygon-mainnet.g.alchemy.com/v2/your_alchemy_api_key

# Discord Integration (Optional)
DISCORD_APPLICATION_ID=your_discord_app_id
DISCORD_API_TOKEN=your_discord_bot_token

# OpenRouter AI (Optional)
OPENROUTER_API_KEY=your_openrouter_key

# Twitter Bot Integration (Optional)
TWITTER_USERNAME=your_twitter_username
TWITTER_PASSWORD=your_twitter_password
TWITTER_EMAIL=your_twitter_email

# API URLs
ETHERSCAN_API_URL=https://api.etherscan.io/api
DEX_SCREENER_API_URL=https://api.dexscreener.com/latest/
ALCHEMY_API_URL=https://eth-mainnet.g.alchemy.com/v2/

# Server Configuration
PORT=3000
NODE_ENV=development

Note: DexScreener API does not require an API key but has a rate limit of 300 requests per minute.
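Given that 300-requests-per-minute limit, a client-side throttle helps avoid rate-limit errors when scanning many tokens. A minimal sliding-window sketch (the limit constant matches the note above; the class itself is an illustrative helper, not part of the repo):

```typescript
// Minimal sliding-window rate limiter for the DexScreener limit noted above.
// waitTime() reports how long a caller should pause before the next request.
// Illustrative helper, not part of the repository.
class RateLimiter {
  private timestamps: number[] = [];

  constructor(private maxRequests = 300, private windowMs = 60_000) {}

  // Milliseconds to wait before the next request is allowed (0 = go now).
  waitTime(now = Date.now()): number {
    this.timestamps = this.timestamps.filter((t) => now - t < this.windowMs);
    if (this.timestamps.length < this.maxRequests) return 0;
    return this.timestamps[0] + this.windowMs - now;
  }

  // Call after each request so the window stays accurate.
  record(now = Date.now()): void {
    this.timestamps.push(now);
  }
}
```

A fetch loop would check `waitTime()` before each DexScreener call, sleep for the returned duration if it is nonzero, then `record()` the request.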

4. Quick Commands

    pnpm start        # Start the server
    pnpm train        # Train the model
    pnpm collect-data # Collect training data
    pnpm test         # Run tests

🔧 Troubleshooting

Common TypeScript Errors

  1. Property Missing Error
Property 'marketCap' does not exist on type '{ volumeAnomaly: boolean; holderConcentration: boolean; liquidityScore: boolean; }'

Fix: Ensure your interfaces match the data structure:

interface TokenMetrics {
  volume: number;
  holders: number;
  liquidity: number;
  priceChange24h: number;
  buyTxns24h: number;
  sellTxns24h: number;
  marketCap: number;
  totalSupply: number;
  currentPrice: number;
  isRugPull: boolean;
  isHoneyPot: boolean;
  timestamp: Date;
}
  2. Training Data Type Mismatch
Argument of type '{ volumeAnomaly: number; holderConcentration: number; liquidityScore: number; isRugPull: boolean; }[]' is not assignable to parameter of type 'TrainingData[]'

Fix: Make sure your training data includes all required fields:

interface TrainingData {
  volumeAnomaly: number;
  holderConcentration: number;
  liquidityScore: number;
  priceVolatility: number;
  sellPressure: number;
  marketCapRisk: number;
  isRugPull: boolean;
}
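The second error usually means some records are missing one of the six numeric fields. A small runtime guard can catch bad rows before they reach the trainer; this is an illustrative helper under the TrainingData shape above, not code from the repo:

```typescript
// Runtime guard for TrainingData records (illustrative helper, not in the repo).
// Checks the six numeric features plus the boolean label from the interface above.
const REQUIRED_NUMERIC = [
  "volumeAnomaly", "holderConcentration", "liquidityScore",
  "priceVolatility", "sellPressure", "marketCapRisk",
] as const;

function isTrainingData(row: Record<string, unknown>): boolean {
  return (
    REQUIRED_NUMERIC.every((k) => typeof row[k] === "number") &&
    typeof row["isRugPull"] === "boolean"
  );
}
```

Filtering collected rows through `isTrainingData` before calling the trainer turns the compile-time mismatch into an explicit data-quality check.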

📊 Model Training

The model is trained on a diverse dataset including:

  • 15 known rug pull tokens (including SQUID, SAFEMOON, LUNA Classic)
  • 15 legitimate tokens (including WETH, USDC, UNI)

Training data is collected from:

  • Etherscan (holder data, contract info)
  • DexScreener (price, volume, liquidity data)

🚀 Usage

  1. Analyze a token:
curl -X POST http://localhost:3000/analyze \
  -H "Content-Type: application/json" \
  -d '{"tokenAddress":"0x..."}'
