Welcome to the Rug Watch Dog, an AI-driven platform that helps investors analyze cryptocurrency tokens, especially meme coins, to detect potential "rug pulls". This project combines machine learning, blockchain data analysis, and chatbot integration to enhance security in the crypto ecosystem. Check out the live demo: RugWatchDog
- AI Risk Analysis: Automatically analyze meme coins for risks like insider holding %, sniper wallet activity, and volume anomalies.
- Blockchain Data Fetching: Integrates with APIs (Etherscan, DexScreener) to fetch real-time token and transaction data.
- EdgeDB Database: Stores and retrieves token data and model predictions.
- Eliza Chatbot Integration: Interact with a conversational AI assistant on Discord, Telegram, and Twitter for real-time insights.
- FUD Alerts: Automatically generate social media alerts for high-risk tokens to keep the community informed.
- Customizable AI Models: Train and adapt the AI to detect emerging fraud patterns in the crypto ecosystem.
```
User Request
     │
     ▼
API Layer (src/api/)
     │
     ▼
Data Collection Layer
     │
     ├──► Etherscan API
     │    (src/data-harvesting/fetcher.ts)
     │
     └──► DexScreener API
          (src/data-harvesting/fetcher.ts)
     │
     ▼
Data Processing
     │
     ├──► Token Metrics
     │    (src/data-processing/metrics.ts)
     │
     └──► Data Storage
          (src/data-processing/storage.ts)
     │
     ▼
ML Analysis
     │
     ├──► Model Prediction
     │    (src/training/predictor.ts)
     │
     └──► Risk Evaluation
          (src/training/evaluator.ts)
     │
     ▼
Response/Alerts
     │
     ├──► API Response
     │    (src/api/routes/)
     │
     └──► Social Integrations
          (src/integrations/)
```
- Input: User submits a token address for analysis
- Data Collection: System fetches data from multiple sources
- Processing: Raw data is transformed into risk metrics
- Analysis: AI model evaluates the risk factors
- Output: Generates alerts or stores results for training
For more details on each step, see the documentation below.
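The five steps above can be sketched end to end. This is a minimal illustration with stubbed data sources, not the project's actual code; the function names (`fetchTokenData`, `computeMetrics`, `predictRisk`, `analyzeToken`), thresholds, and weights are all assumptions for the sake of the example.

```typescript
// Hypothetical end-to-end sketch of the analysis flow described above.
// The stubbed fetcher stands in for the Etherscan/DexScreener calls in
// src/data-harvesting/fetcher.ts; names and shapes are illustrative only.

interface RawTokenData {
  holders: number;
  topHolderBalancePct: number; // % of supply held by top wallets
  volume24h: number;
  liquidity: number;
}

interface RiskMetrics {
  holderConcentration: number; // 0..1, higher = riskier
  volumeAnomaly: number;       // 0..1, volume out of proportion to liquidity
  liquidityScore: number;      // 0..1, higher = riskier (thin liquidity)
}

// Stub for the data-collection layer (would call Etherscan/DexScreener).
function fetchTokenData(tokenAddress: string): RawTokenData {
  return { holders: 120, topHolderBalancePct: 62, volume24h: 500_000, liquidity: 20_000 };
}

// Processing: turn raw data into normalized risk metrics.
function computeMetrics(raw: RawTokenData): RiskMetrics {
  return {
    holderConcentration: raw.topHolderBalancePct / 100,
    volumeAnomaly: raw.volume24h > 100 * raw.liquidity ? 1 : raw.volume24h / (100 * raw.liquidity),
    liquidityScore: raw.liquidity < 50_000 ? 1 - raw.liquidity / 50_000 : 0,
  };
}

// ML-analysis stand-in: a fixed weighted sum instead of the trained model.
function predictRisk(m: RiskMetrics): number {
  return 0.4 * m.holderConcentration + 0.3 * m.volumeAnomaly + 0.3 * m.liquidityScore;
}

// Output: a risk score plus an alert flag for high-risk tokens.
function analyzeToken(tokenAddress: string): { riskScore: number; alert: boolean } {
  const metrics = computeMetrics(fetchTokenData(tokenAddress));
  const riskScore = predictRisk(metrics);
  return { riskScore, alert: riskScore > 0.5 };
}
```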
- `routes/`: API endpoint handlers
  - `analyze.ts`: Token analysis endpoints
  - `metrics.ts`: Metrics retrieval
  - `tokens.ts`: Token management
- `middleware/`: Request processing
  - `auth.ts`: Authentication handling
- `fetcher.ts`: External API integrations
- `collector.ts`: Data collection orchestration
- `chainMonitor.ts`: Blockchain scanning
- `tokenScanner.ts`: Token-specific scanning
- `metrics.ts`: Token metrics calculation
- `parser.ts`: Raw data parsing
- `storage.ts`: Data persistence layer
- `modelPredictor.ts`: Risk prediction logic
- `modelEvaluator.ts`: Model evaluation
- `modelTrainer.ts`: Model training pipeline
- `models/`: Database schemas
- `migrations/`: Database migrations
- `connection.ts`: Database configuration
- `collect-data.ts`: Training data collection
- `clean-db.ts`: Database maintenance
- `train.ts`: Model training execution
- `api.ts`: API interfaces
- `data.ts`: Data structure types
- `utils.ts`: Helper functions
- `datasets/`: Training datasets
- `trained/`: Trained model files
- `src/chat/index.ts`: Chat interface implementation
- `src/clients/index.ts`: Social media client integrations
- `src/cache/index.ts`: Performance optimization
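To illustrate the kind of computation the metrics layer performs, here is a sketch of a holder-concentration metric. The function name and the Herfindahl-style formula are assumptions for illustration, not the repository's actual implementation in `src/data-processing/metrics.ts`.

```typescript
// Hypothetical holder-concentration metric, in the spirit of
// src/data-processing/metrics.ts. The repo's actual formula may differ.

// Herfindahl–Hirschman index over holder balance shares: 1/N for a
// perfectly even distribution, approaching 1 when one wallet holds everything.
function holderConcentration(balances: number[]): number {
  const total = balances.reduce((a, b) => a + b, 0);
  if (total === 0) return 0;
  return balances.reduce((acc, b) => acc + (b / total) ** 2, 0);
}
```

A score near 1 would feed into the model as a strong insider-holding signal; a score near 1/N indicates a widely distributed supply.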
The database schema is defined in the `dbschema` directory, in the `default.esdl` file. You can generate the schema using the EdgeDB CLI:

```shell
edgedb schema generate
```

Database migrations are managed in the `migrations` directory. You can create migrations using the EdgeDB CLI:

```shell
edgedb migration create
```
1. **Open EdgeDB Shell**: Run the following command in your terminal to open the EdgeDB interactive shell:

   ```shell
   edgedb
   ```

2. **List All Object Types**: Use the following EdgeQL command to list all object types (tables) and their properties:

   ```edgeql
   SELECT schema::ObjectType { name, properties: { name, target: { name } } } FILTER .name LIKE 'default::%';
   ```

3. **Create a New Migration**: Ensure your `.esdl` files reflect the current desired schema state, then run:

   ```shell
   edgedb migration create
   ```

4. **Apply the Migration**:

   ```shell
   edgedb migrate
   ```
Create a new migration, apply it to the cloud instance, and generate the query builder:

```shell
edgedb migration create
edgedb migrate
edgedb migrate -I mollybeach/rug-watch-dog-db
pnpm generate edgeql-js
```
- Connect to the EdgeDB instance:

```shell
edgedb -I mollybeach/rug-watch-dog-db
```

- Write a SELECT query to check the data:

```edgeql
SELECT Token {
  address,
  name,
  symbol,
  metrics: {
    tokenAddress,
    volumeAnomaly,
    holderConcentration,
    liquidityScore,
    priceVolatility,
    sellPressure,
    marketCapRisk,
    bundlerActivity,
    accumulationRate,
    stealthAccumulation,
    suspiciousPattern,
    isRugPull,
    timestamp,
    holders,
    totalSupply,
    currentPrice,
    isHoneyPot
  },
  price: {
    tokenAddress,
    price,
    liquidity,
    volume24h,
    marketCap,
    timestamp
  },
  createdAt,
  updatedAt
};
```
To check the contents of your EdgeDB database, run a SELECT query from the EdgeDB shell. For example, to retrieve data from the `TokenMetrics` and `TokenPrices` tables:

```edgeql
SELECT TokenMetrics;
SELECT TokenPrices;
```
4. **Login to EdgeDB**:

```shell
edgedb cloud login
```

- Connect to Your EdgeDB Instance: Use the `edgedb` command with your connection details (host, port, username, and database name). For example:

```shell
edgedb -H your-edgedb-host -P your-port -u your-username -d your-database
```

or

```shell
edgedb -I your-instance-name
```

- Generate the query builder:

```shell
pnpm generate edgeql-js
```

- Migrate to the cloud instance:

```shell
edgedb migrate -I mollybeach/rug-watch-dog-db
```

To close the EdgeDB shell, press `CTRL + D`.
```shell
git clone https://github.com/mollybeach/rug-watch-dog.git
cd rug-watch-dog
pnpm install
```

Create a `.env` file in the root directory:
```env
# API Keys
ETHERSCAN_API_KEY=your_etherscan_key_here
BSCSCAN_API_KEY=your_bscscan_api_key_here
POLYGONSCAN_API_KEY=your_polygonscan_api_key_here
ALCHEMY_API_KEY=your_alchemy_api_key_here

# RPC Endpoints
ETHEREUM_RPC=https://eth-mainnet.g.alchemy.com/v2/your_alchemy_api_key
BSC_RPC=https://bsc-dataseed1.binance.org
POLYGON_RPC=https://polygon-mainnet.g.alchemy.com/v2/your_alchemy_api_key

# Discord Integration (Optional)
DISCORD_APPLICATION_ID=your_discord_app_id
DISCORD_API_TOKEN=your_discord_bot_token

# OpenRouter AI (Optional)
OPENROUTER_API_KEY=your_openrouter_key

# Twitter Bot Integration (Optional)
TWITTER_USERNAME=your_twitter_username
TWITTER_PASSWORD=your_twitter_password
TWITTER_EMAIL=your_twitter_email

# API URLs
ETHERSCAN_API_URL=https://api.etherscan.io/api
DEX_SCREENER_API_URL=https://api.dexscreener.com/latest/
ALCHEMY_API_URL=https://eth-mainnet.g.alchemy.com/v2/

# Server Configuration
PORT=3000
NODE_ENV=development
```
Note: DexScreener API does not require an API key but has a rate limit of 300 requests per minute.
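Given that 300 requests/minute limit, a simple client-side throttle can avoid rate-limit errors. The sliding-window limiter below is an illustrative sketch of how callers might wrap their DexScreener requests; it is an assumption, not part of this repository.

```typescript
// Minimal sliding-window rate limiter for the 300 req/min DexScreener limit.
// Illustrative only; the project may throttle requests differently.

class RateLimiter {
  private timestamps: number[] = [];
  constructor(private maxRequests: number, private windowMs: number) {}

  // Returns true if a request may be sent now, and records it.
  // `now` is injectable for testing; defaults to the current time.
  tryAcquire(now: number = Date.now()): boolean {
    // Drop timestamps that have fallen out of the window.
    this.timestamps = this.timestamps.filter((t) => now - t < this.windowMs);
    if (this.timestamps.length >= this.maxRequests) return false;
    this.timestamps.push(now);
    return true;
  }
}
```

A caller would construct `new RateLimiter(300, 60_000)` and check `tryAcquire()` before each DexScreener request, backing off briefly when it returns false.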
- Common commands:

```shell
pnpm start        # Start the server
pnpm train        # Train the model
pnpm collect-data # Collect training data
pnpm test         # Run tests
```
- **Property Missing Error**

```
Property 'marketCap' does not exist on type '{ volumeAnomaly: boolean; holderConcentration: boolean; liquidityScore: boolean; }'
```

Fix: Ensure your interfaces match the data structure:

```typescript
interface TokenMetrics {
  volume: number;
  holders: number;
  liquidity: number;
  priceChange24h: number;
  buyTxns24h: number;
  sellTxns24h: number;
  marketCap: number;
  totalSupply: number;
  currentPrice: number;
  isRugPull: boolean;
  isHoneyPot: boolean;
  timestamp: Date;
}
```
- **Training Data Type Mismatch**

```
Argument of type '{ volumeAnomaly: number; holderConcentration: number; liquidityScore: number; isRugPull: boolean; }[]' is not assignable to parameter of type 'TrainingData[]'
```

Fix: Make sure your training data includes all required fields:

```typescript
interface TrainingData {
  volumeAnomaly: number;
  holderConcentration: number;
  liquidityScore: number;
  priceVolatility: number;
  sellPressure: number;
  marketCapRisk: number;
  isRugPull: boolean;
}
```
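Since compile-time types can't validate rows parsed from external APIs at runtime, one way to catch this mismatch before data reaches the trainer is a runtime type guard. The `isTrainingData` helper below is an illustrative sketch, not part of the repository.

```typescript
// Hypothetical runtime guard that verifies every TrainingData field is
// present and correctly typed before a row is handed to the trainer.

interface TrainingData {
  volumeAnomaly: number;
  holderConcentration: number;
  liquidityScore: number;
  priceVolatility: number;
  sellPressure: number;
  marketCapRisk: number;
  isRugPull: boolean;
}

const NUMERIC_FIELDS: (keyof TrainingData)[] = [
  "volumeAnomaly", "holderConcentration", "liquidityScore",
  "priceVolatility", "sellPressure", "marketCapRisk",
];

// Narrows an unknown value to TrainingData if all fields check out.
function isTrainingData(row: unknown): row is TrainingData {
  if (typeof row !== "object" || row === null) return false;
  const r = row as Record<string, unknown>;
  return NUMERIC_FIELDS.every((f) => typeof r[f] === "number")
    && typeof r.isRugPull === "boolean";
}
```

Filtering a dataset with `rows.filter(isTrainingData)` would then surface incomplete rows early instead of producing the type error above.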
The model is trained on a diverse dataset including:
- 15 known rug pull tokens (including SQUID, SAFEMOON, LUNA Classic)
- 15 legitimate tokens (including WETH, USDC, UNI)
Training data is collected from:
- Etherscan (holder data, contract info)
- DexScreener (price, volume, liquidity data)
- Analyze a token:

```shell
curl -X POST http://localhost:3000/analyze \
  -H "Content-Type: application/json" \
  -d '{"tokenAddress":"0x..."}'
```