Small organizations, startups, and self-hosted servers face increasing strain from automated web crawlers and AI bots, whose traffic has grown dramatically in the past few years (Imperva, 2024 Bad Bot Report). Modern bots evade traditional throttling and can degrade server performance through sheer volume even when they are well behaved. Current tools that rely on public, shared blocklists for detection quickly go out of date; one study found that 87% of new attacks do not appear on such lists (Li et al., 2021, "Good Bot, Bad Bot"). Our interest is in detecting any mechanical access pattern, well behaved or malicious, and distinguishing it from human patterns.

We introduce Logrip, an open-source command-line tool, and a novel security approach that leverages data visualization and hierarchical IP hashing to analyze historical server event logs, distinguishing human users from automated entities based on access patterns. By aggregating IP activity across subnet classes and applying novel statistical measures related to non-human behavior, our method detects coordinated bot activity and distributed crawling attacks that conventional tools fail to identify. In a real-world case study, we estimate that 80–95% of traffic in our examples originates from AI crawlers, underscoring the need for improved filtering mechanisms. We release our tools as open source so that small organizations can regulate automated traffic effectively, preserving public human access by mitigating performance degradation.

By: Rama Hoetzlein | Founder, Quanta Sciences

Presentation Materials Available at: https://ift.tt/U1Ssv8j
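The abstract describes the approach but not its implementation. As a minimal sketch (not Logrip's actual code), the snippet below shows one way aggregation across classful subnet prefixes (/8, /16, /24) might work: counting requests and distinct member IPs at each prefix level of an access log, then flagging prefixes where many distinct IPs contribute to a large aggregate volume, which is a signature of distributed crawling rather than individual human use. All function names and thresholds here are hypothetical.

```python
import re
from collections import defaultdict

# Matches the leading IPv4 address of a combined-format access log line.
LOG_IP = re.compile(r"^(\d{1,3}(?:\.\d{1,3}){3})\s")

def subnet_keys(ip):
    """Return hierarchical prefixes for an IPv4 address: /8, /16, /24, /32."""
    octets = ip.split(".")
    return [".".join(octets[:n]) for n in (1, 2, 3, 4)]

def aggregate(log_lines):
    """Count requests and distinct IPs at each subnet level."""
    requests = defaultdict(int)   # prefix -> request count
    members = defaultdict(set)    # prefix -> distinct member IPs
    for line in log_lines:
        m = LOG_IP.match(line)
        if not m:
            continue
        ip = m.group(1)
        for key in subnet_keys(ip):
            requests[key] += 1
            members[key].add(ip)
    return requests, members

def flag_distributed_crawlers(requests, members, min_ips=50, min_requests=5000):
    """Flag /16 and /24 prefixes where many distinct IPs each issue modest
    traffic but the aggregate volume is large -- a pattern consistent with
    a distributed crawler rather than a population of human visitors."""
    flagged = []
    for prefix, count in requests.items():
        depth = prefix.count(".") + 1  # 2 = /16, 3 = /24
        if depth in (2, 3) and len(members[prefix]) >= min_ips and count >= min_requests:
            flagged.append((prefix, len(members[prefix]), count))
    return sorted(flagged, key=lambda t: -t[2])

if __name__ == "__main__":
    import sys
    with open(sys.argv[1], errors="replace") as f:
        reqs, mems = aggregate(f)
    for prefix, n_ips, n_reqs in flag_distributed_crawlers(reqs, mems):
        print(f"{prefix}.*  {n_ips} IPs, {n_reqs} requests")
```

Run against a log file (e.g. `python3 sketch.py access.log`), it prints the highest-volume flagged prefixes. A per-IP rate limiter misses exactly this case, since each individual address stays under the radar while the subnet as a whole dominates traffic.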
source https://www.youtube.com/watch?v=S5DJtN1FDYo