
Yaping - Yet Another PING

A SmokePing-like network latency monitoring tool written in Python.

Overview

yaping is a lightweight, non-root network monitoring tool that measures latency and packet loss to multiple targets using various probe methods, stores results in SQLite, and displays beautiful terminal graphs.

Key Features

  • Multiple Probe Methods: ICMP (via subprocess), TCP connect, HTTP/HTTPS timing
  • No Root Required: Uses system ping command or pure-Python TCP/HTTP probes
  • SQLite Storage: Persistent storage of all measurements
  • Terminal Graphs: Rich CLI visualization using plotext
  • Flexible Configuration: TOML config file + CLI arguments
  • Single-file Script: Uses PEP 723 inline dependencies with uv run

Architecture

flowchart TB
    subgraph CLI[CLI Interface]
        ADD[yaping add target]
        RUN[yaping run]
        STATS[yaping stats]
        GRAPH[yaping graph]
        LIST[yaping list]
    end

    subgraph Core[Core Modules]
        PROBES[Probe Methods]
        SCHEDULER[Scheduler]
        DB[SQLite Storage]
    end

    subgraph Probes[Probe Types]
        ICMP[ICMP - subprocess ping]
        TCP[TCP Connect]
        HTTP[HTTP/HTTPS Request]
    end

    CLI --> Core
    PROBES --> Probes
    SCHEDULER --> PROBES
    SCHEDULER --> DB
    GRAPH --> DB
    STATS --> DB

File Structure

yaping/
├── yaping.py          # Main single-file script with PEP 723 deps
├── yaping.toml        # Optional configuration file
├── yaping.db          # SQLite database (auto-created)
└── plans/
    └── yaping-architecture.md

Database Schema

-- Targets to monitor
CREATE TABLE targets (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    name TEXT UNIQUE NOT NULL,
    host TEXT NOT NULL,
    probe_type TEXT NOT NULL DEFAULT 'icmp',  -- icmp, tcp, http
    port INTEGER,                              -- For TCP/HTTP probes
    interval INTEGER DEFAULT 60,              -- Probe interval in seconds
    enabled INTEGER DEFAULT 1,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- Measurement results
CREATE TABLE measurements (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    target_id INTEGER NOT NULL,
    timestamp TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    latency_ms REAL,                          -- NULL if packet lost
    success INTEGER NOT NULL,                 -- 1=success, 0=failure
    error_message TEXT,                       -- Error details if failed
    FOREIGN KEY (target_id) REFERENCES targets(id)
);

-- Index for efficient time-range queries
CREATE INDEX idx_measurements_target_time 
    ON measurements(target_id, timestamp);
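
As a sketch of how this schema might be exercised from the standard-library sqlite3 module (the table and column names come from the schema above; the helper names `init_db` and `recent_measurements` are illustrative, not the final API):

```python
import sqlite3

def init_db(path: str = ":memory:") -> sqlite3.Connection:
    """Open the database, enable foreign keys, and create the schema."""
    conn = sqlite3.connect(path)
    conn.execute("PRAGMA foreign_keys = ON")  # SQLite needs this per connection
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS targets (
            id INTEGER PRIMARY KEY AUTOINCREMENT,
            name TEXT UNIQUE NOT NULL,
            host TEXT NOT NULL,
            probe_type TEXT NOT NULL DEFAULT 'icmp',
            port INTEGER,
            interval INTEGER DEFAULT 60,
            enabled INTEGER DEFAULT 1,
            created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
        );
        CREATE TABLE IF NOT EXISTS measurements (
            id INTEGER PRIMARY KEY AUTOINCREMENT,
            target_id INTEGER NOT NULL,
            timestamp TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
            latency_ms REAL,
            success INTEGER NOT NULL,
            error_message TEXT,
            FOREIGN KEY (target_id) REFERENCES targets(id)
        );
        CREATE INDEX IF NOT EXISTS idx_measurements_target_time
            ON measurements(target_id, timestamp);
    """)
    return conn

def recent_measurements(conn: sqlite3.Connection, target_id: int, hours: int = 1) -> list:
    """Fetch measurements from the last N hours; served by the index above."""
    return conn.execute(
        "SELECT timestamp, latency_ms, success FROM measurements "
        "WHERE target_id = ? AND timestamp >= datetime('now', ?) "
        "ORDER BY timestamp",
        (target_id, f"-{hours} hours"),
    ).fetchall()
```

Note that CURRENT_TIMESTAMP stores UTC, so the time-range filter also compares against `datetime('now')` in UTC.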

CLI Commands

Target Management

# Add targets with different probe methods
uv run yaping.py add google --host google.com --method icmp
uv run yaping.py add cloudflare-dns --host 1.1.1.1 --method tcp --port 53
uv run yaping.py add github-api --host https://api.github.com --method http

# List all targets
uv run yaping.py list

# Remove a target
uv run yaping.py remove google

# Enable/disable a target
uv run yaping.py disable google
uv run yaping.py enable google

Running Probes

# Run continuous monitoring (foreground)
uv run yaping.py run

# Run single probe for all targets
uv run yaping.py probe

# Run with custom interval
uv run yaping.py run --interval 30

Viewing Results

# Show statistics for all targets
uv run yaping.py stats

# Show stats for specific target
uv run yaping.py stats google

# Show stats for last hour/day/week
uv run yaping.py stats --period 1h
uv run yaping.py stats --period 24h
uv run yaping.py stats --period 7d

# Display terminal graph
uv run yaping.py graph

# Graph for specific target
uv run yaping.py graph google

# Graph with custom time range
uv run yaping.py graph --period 24h
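
The --period strings above can be parsed with a small helper; `parse_period` is an assumed name, and only the suffixes shown in the examples (plus seconds and minutes) are sketched here:

```python
from datetime import timedelta

# Map of period suffix to the matching timedelta keyword argument
_UNITS = {"s": "seconds", "m": "minutes", "h": "hours", "d": "days"}

def parse_period(period: str) -> timedelta:
    """Turn a string like '1h', '24h', or '7d' into a timedelta."""
    unit = period[-1]
    if unit not in _UNITS:
        raise ValueError(f"unknown period unit: {period!r}")
    return timedelta(**{_UNITS[unit]: int(period[:-1])})
```
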

Probe Methods

1. ICMP Probe (subprocess)

Uses the system ping command:

  • Linux: ping -c 1 -W 5 <host>
  • macOS: ping -c 1 -W 5000 <host>

Parses output to extract latency.

2. TCP Connect Probe

Pure Python socket connection timing:

import socket
import time

def tcp_probe(host: str, port: int, timeout: float = 5.0) -> float | None:
    """Time a TCP connection; return latency in ms, or None on failure."""
    start = time.perf_counter()
    try:
        sock = socket.create_connection((host, port), timeout=timeout)
        sock.close()
        return (time.perf_counter() - start) * 1000  # ms
    except OSError:  # covers socket.timeout, an alias of TimeoutError since 3.10
        return None

3. HTTP/HTTPS Probe

Uses httpx for accurate timing:

import httpx
import time

def http_probe(url: str, timeout: float = 5.0) -> float | None:
    """Time a full HTTP GET, following redirects; return ms, or None on failure."""
    start = time.perf_counter()
    try:
        httpx.get(url, timeout=timeout, follow_redirects=True)
        return (time.perf_counter() - start) * 1000  # ms
    except httpx.HTTPError:
        return None

Terminal Graph Visualization

Using plotext for terminal graphs:

╭─────────────────────────────────────────────────────────────────────╮
│                     Latency - google (last 1h)                      │
╰─────────────────────────────────────────────────────────────────────╯
    50 ┤                                                               
    45 ┤            ╭─╮                                                
    40 ┤           ╭╯ ╰╮                    ╭╮                          
    35 ┤     ╭─╮  ╭╯   ╰╮                  ╭╯╰╮                         
    30 ┼─────╯ ╰──╯     ╰──────────────────╯  ╰─────────────────       
    25 ┤                                                               
    20 ┤                                                               
       └────────────────────────────────────────────────────────────
        15:00       15:15       15:30       15:45       16:00

Statistics: avg=32.4ms, min=25.1ms, max=48.2ms, loss=0.0%

Multi-target comparison:

╭─────────────────────────────────────────────────────────────────────╮
│                    Latency Comparison (last 1h)                     │
╰─────────────────────────────────────────────────────────────────────╯
    100 ┤                                                              
     80 ┤      ■                    ■                                  
     60 ┤    ■ ■  ■              ■  ■  ■                               
     40 ┼──●─●─●──●──●──●──●──●──●──●──●──●──●──●──●──●──●            
     20 ┤                                                              
        └────────────────────────────────────────────────────────────
         15:00       15:15       15:30       15:45       16:00

● google (avg: 32ms)  ■ cloudflare-dns (avg: 65ms)

PEP 723 Inline Dependencies

#!/usr/bin/env python3
# /// script
# requires-python = ">=3.11"
# dependencies = [
#     "click>=8.1",
#     "httpx>=0.27",
#     "plotext>=5.2",
#     "rich>=13.7",
# ]
# ///

Configuration File (yaping.toml)

[defaults]
interval = 60        # Default probe interval in seconds
timeout = 5          # Default timeout in seconds
database = "~/.local/share/yaping/yaping.db"

[[targets]]
name = "google"
host = "google.com"
method = "icmp"
interval = 30

[[targets]]
name = "cloudflare-dns"
host = "1.1.1.1"
method = "tcp"
port = 53

[[targets]]
name = "github-api"
host = "https://api.github.com/zen"
method = "http"
interval = 120

Implementation Phases

Phase 1: Core Infrastructure

  • SQLite database layer with schema
  • Probe method implementations (ICMP, TCP, HTTP)
  • Basic CLI skeleton with Click

Phase 2: Target Management

  • Add/remove/list/enable/disable commands
  • Configuration file parsing

Phase 3: Monitoring

  • Scheduler for continuous probing
  • Single probe command for testing
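
The scheduler can stay simple: pick the targets whose interval has elapsed, probe them, sleep briefly, repeat. A sketch of the due-selection step (the target dict shape is an assumption for illustration):

```python
def due_targets(targets: list[dict], last_run: dict[str, float], now: float) -> list[dict]:
    """Return enabled targets whose probe interval has elapsed since their last run."""
    return [
        t for t in targets
        if t.get("enabled", True)
        # never-probed targets default to -inf, so they are always due
        and now - last_run.get(t["name"], float("-inf")) >= t["interval"]
    ]
```

Keeping this pure (time passed in, nothing global) makes the scheduler trivial to unit-test.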

Phase 4: Visualization

  • Statistics calculation and display
  • Terminal graphs with plotext
  • Rich formatting for tables and output

Dependencies

Package   Purpose
-------   -------
click     CLI framework
httpx     HTTP/HTTPS probing
plotext   Terminal graphs
rich      Beautiful terminal output, tables, progress

Usage Example Session

# First run - add some targets
$ uv run yaping.py add google --host google.com
✓ Added target 'google' (icmp -> google.com)

$ uv run yaping.py add cloudflare --host 1.1.1.1 --method tcp --port 443
✓ Added target 'cloudflare' (tcp -> 1.1.1.1:443)

$ uv run yaping.py add github --host https://api.github.com --method http
✓ Added target 'github' (http -> https://api.github.com)

# List targets
$ uv run yaping.py list
┏━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━┳━━━━━━━━━━┓
┃ Name       ┃ Host                     ┃ Method ┃ Interval ┃
┡━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━╇━━━━━━━━━━┩
│ google     │ google.com               │ icmp   │ 60s      │
│ cloudflare │ 1.1.1.1:443              │ tcp    │ 60s      │
│ github     │ https://api.github.com   │ http   │ 60s      │
└────────────┴──────────────────────────┴────────┴──────────┘

# Run monitoring (Ctrl+C to stop)
$ uv run yaping.py run
[15:45:01] google: 28.3ms ✓
[15:45:01] cloudflare: 12.1ms ✓
[15:45:02] github: 145.2ms ✓
...

# View statistics
$ uv run yaping.py stats
┏━━━━━━━━━━━━┳━━━━━━━━━┳━━━━━━━━━┳━━━━━━━━━┳━━━━━━━━━┳━━━━━━━━┓
┃ Target     ┃ Avg     ┃ Min     ┃ Max     ┃ StdDev  ┃ Loss   ┃
┡━━━━━━━━━━━━╇━━━━━━━━━╇━━━━━━━━━╇━━━━━━━━━╇━━━━━━━━━╇━━━━━━━━┩
│ google     │ 29.1ms  │ 25.3ms  │ 45.2ms  │ 4.2ms   │ 0.0%   │
│ cloudflare │ 11.8ms  │ 10.1ms  │ 15.3ms  │ 1.1ms   │ 0.0%   │
│ github     │ 142.3ms │ 135.1ms │ 189.2ms │ 12.4ms  │ 1.2%   │
└────────────┴─────────┴─────────┴─────────┴─────────┴────────┘

# Display graph
$ uv run yaping.py graph google --period 1h

Testing Strategy

Tests will be in a separate file test_yaping.py using pytest with inline PEP 723 dependencies.

Test File Structure

#!/usr/bin/env python3
# /// script
# requires-python = ">=3.11"
# dependencies = [
#     "pytest>=8.0",
#     "pytest-asyncio>=0.23",
#     "click>=8.1",
#     "httpx>=0.27",
#     "plotext>=5.2",
#     "rich>=13.7",
# ]
# ///

Run tests with uv run pytest test_yaping.py -v (or uv run test_yaping.py, provided the script ends by invoking pytest.main() on itself).

Test Categories

1. Unit Tests - Probe Methods

class TestProbes:
    def test_icmp_probe_success(self):
        """Test ICMP probe against localhost"""
        result = icmp_probe("127.0.0.1")
        assert result is not None
        assert result > 0

    def test_icmp_probe_failure(self):
        """Test ICMP probe against invalid host"""
        result = icmp_probe("invalid.host.that.does.not.exist.local")
        assert result is None

    @pytest.mark.network
    def test_tcp_probe_success(self):
        """Test TCP probe against known open port"""
        result = tcp_probe("1.1.1.1", 53)  # Cloudflare DNS
        assert result is not None
        assert result > 0

    def test_tcp_probe_failure(self):
        """Test TCP probe against closed port"""
        result = tcp_probe("127.0.0.1", 59999)  # Unlikely to be open
        assert result is None

    @pytest.mark.network
    def test_http_probe_success(self):
        """Test HTTP probe against known endpoint"""
        result = http_probe("https://httpbin.org/get")
        assert result is not None
        assert result > 0

    def test_http_probe_failure(self):
        """Test HTTP probe against invalid URL"""
        result = http_probe("https://invalid.domain.local")
        assert result is None

2. Unit Tests - Database Layer

class TestDatabase:
    @pytest.fixture
    def temp_db(self, tmp_path):
        """Create a temporary database for testing"""
        db_path = tmp_path / "test.db"
        db = Database(str(db_path))
        db.init()
        yield db
        db.close()

    def test_add_target(self, temp_db):
        """Test adding a target"""
        target_id = temp_db.add_target("test", "example.com", "icmp")
        assert target_id is not None
        assert target_id > 0

    def test_add_duplicate_target(self, temp_db):
        """Test that a duplicate target name raises an integrity error"""
        temp_db.add_target("test", "example.com", "icmp")
        with pytest.raises(sqlite3.IntegrityError):  # name is UNIQUE
            temp_db.add_target("test", "other.com", "tcp")

    def test_get_targets(self, temp_db):
        """Test retrieving targets"""
        temp_db.add_target("test1", "example1.com", "icmp")
        temp_db.add_target("test2", "example2.com", "tcp", port=80)
        targets = temp_db.get_targets()
        assert len(targets) == 2

    def test_record_measurement(self, temp_db):
        """Test recording a measurement"""
        target_id = temp_db.add_target("test", "example.com", "icmp")
        temp_db.record_measurement(target_id, 25.5, success=True)
        stats = temp_db.get_stats(target_id)
        assert stats["count"] == 1
        assert stats["avg"] == 25.5

    def test_record_failed_measurement(self, temp_db):
        """Test recording a failed measurement"""
        target_id = temp_db.add_target("test", "example.com", "icmp")
        temp_db.record_measurement(target_id, None, success=False, error="Timeout")
        stats = temp_db.get_stats(target_id)
        assert stats["loss_percent"] == 100.0

    def test_get_measurements_time_range(self, temp_db):
        """Test retrieving measurements with time filter"""
        target_id = temp_db.add_target("test", "example.com", "icmp")
        temp_db.record_measurement(target_id, 25.5, success=True)
        measurements = temp_db.get_measurements(target_id, period="1h")
        assert len(measurements) == 1

3. Integration Tests - CLI Commands

from click.testing import CliRunner

class TestCLI:
    @pytest.fixture
    def runner(self):
        return CliRunner()

    @pytest.fixture
    def isolated_env(self, runner, tmp_path):
        """Run CLI in isolated filesystem with temp database"""
        with runner.isolated_filesystem(temp_dir=tmp_path):
            yield runner

    def test_add_command(self, isolated_env):
        """Test add command"""
        result = isolated_env.invoke(cli, ["add", "test", "--host", "example.com"])
        assert result.exit_code == 0
        assert "Added target" in result.output

    def test_list_command(self, isolated_env):
        """Test list command with no targets"""
        result = isolated_env.invoke(cli, ["list"])
        assert result.exit_code == 0

    def test_list_command_with_targets(self, isolated_env):
        """Test list command with targets"""
        isolated_env.invoke(cli, ["add", "test", "--host", "example.com"])
        result = isolated_env.invoke(cli, ["list"])
        assert result.exit_code == 0
        assert "test" in result.output

    def test_remove_command(self, isolated_env):
        """Test remove command"""
        isolated_env.invoke(cli, ["add", "test", "--host", "example.com"])
        result = isolated_env.invoke(cli, ["remove", "test"])
        assert result.exit_code == 0
        assert "Removed" in result.output

    def test_stats_no_data(self, isolated_env):
        """Test stats command with no measurements"""
        isolated_env.invoke(cli, ["add", "test", "--host", "example.com"])
        result = isolated_env.invoke(cli, ["stats"])
        assert result.exit_code == 0

    def test_probe_command(self, isolated_env):
        """Test single probe command"""
        isolated_env.invoke(cli, ["add", "localhost", "--host", "127.0.0.1"])
        result = isolated_env.invoke(cli, ["probe"])
        assert result.exit_code == 0

4. Unit Tests - Statistics Calculation

class TestStatistics:
    def test_calculate_stats_empty(self):
        """Test stats with no data"""
        stats = calculate_stats([])
        assert stats["count"] == 0
        assert stats["avg"] is None

    def test_calculate_stats_single(self):
        """Test stats with single measurement"""
        stats = calculate_stats([25.0])
        assert stats["count"] == 1
        assert stats["avg"] == 25.0
        assert stats["min"] == 25.0
        assert stats["max"] == 25.0

    def test_calculate_stats_multiple(self):
        """Test stats with multiple measurements"""
        stats = calculate_stats([10.0, 20.0, 30.0])
        assert stats["count"] == 3
        assert stats["avg"] == 20.0
        assert stats["min"] == 10.0
        assert stats["max"] == 30.0

    def test_calculate_loss_percent(self):
        """Test packet loss calculation"""
        # 2 successes, 1 failure
        loss = calculate_loss_percent(successes=2, total=3)
        assert abs(loss - 33.33) < 0.1
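
A minimal implementation that would satisfy these statistics tests, using the standard-library statistics module (the function names match the tests above; the exact return shape is a sketch):

```python
import statistics

def calculate_stats(latencies: list[float]) -> dict:
    """Summarize a list of successful latency samples (ms)."""
    if not latencies:
        return {"count": 0, "avg": None, "min": None, "max": None, "stddev": None}
    return {
        "count": len(latencies),
        "avg": statistics.fmean(latencies),
        "min": min(latencies),
        "max": max(latencies),
        # sample stddev needs at least two points
        "stddev": statistics.stdev(latencies) if len(latencies) > 1 else 0.0,
    }

def calculate_loss_percent(successes: int, total: int) -> float:
    """Percentage of probes that failed; 0.0 when nothing was probed."""
    if total == 0:
        return 0.0
    return (total - successes) / total * 100
```
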

5. Unit Tests - Configuration Parsing

class TestConfig:
    def test_parse_config(self, tmp_path):
        """Test parsing TOML config"""
        config_content = '''
[defaults]
interval = 30

[[targets]]
name = "test"
host = "example.com"
method = "icmp"
'''
        config_path = tmp_path / "yaping.toml"
        config_path.write_text(config_content)
        
        config = parse_config(str(config_path))
        assert config["defaults"]["interval"] == 30
        assert len(config["targets"]) == 1

    def test_parse_config_missing(self):
        """Test handling missing config file"""
        config = parse_config("/nonexistent/path/config.toml")
        assert config is None

Test Coverage Goals

Module          Target Coverage
------          ---------------
Probe methods   90%
Database layer  95%
CLI commands    85%
Statistics      100%
Config parsing  90%

Running Tests

# Run all tests
uv run pytest test_yaping.py -v

# Run with coverage
uv run pytest test_yaping.py --cov=yaping --cov-report=term-missing

# Run specific test category
uv run pytest test_yaping.py -v -k "TestProbes"

# Run only fast tests (skip network)
uv run pytest test_yaping.py -v -m "not network"

Test Markers

# Mark slow/network tests
@pytest.mark.network
def test_real_ping():
    """Tests that require network access"""
    pass

@pytest.mark.slow
def test_long_running():
    """Tests that take a while"""
    pass
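
For -m "not network" to run without warnings, the markers should also be registered, e.g. with a pyproject.toml fragment like this (placement is a suggestion; a pytest.ini would work equally well):

```toml
[tool.pytest.ini_options]
markers = [
    "network: tests that require network access",
    "slow: tests that take a while",
]
```
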

Notes

  • The tool is designed as a single Python file for easy distribution
  • All dependencies are specified inline using PEP 723
  • Run with uv run yaping.py - uv handles dependency installation automatically
  • Database defaults to ~/.local/share/yaping/yaping.db but can be overridden
  • Graceful handling of Ctrl+C during monitoring
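
The last note (graceful Ctrl+C) can be as small as catching KeyboardInterrupt around the main loop; a sketch with the probe step and sleep injected so the loop stays testable (names are illustrative):

```python
import time

def monitor(probe_once, interval: float = 60.0, sleep=time.sleep) -> None:
    """Probe forever; exit cleanly when the user presses Ctrl+C."""
    try:
        while True:
            probe_once()
            sleep(interval)
    except KeyboardInterrupt:
        # Swallow the interrupt so the user sees a clean exit, not a traceback
        print("\nStopping yaping.")
```
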