Fix a bug in non-gather mode (sorting an existing blacklist file).
Signed-off-by: Anestis Bechtsoudis <anestis@census-labs.com>
* While parsing, ensure that the blacklist stack hash
entries are sorted, since we do an interpolation search
(see the sketch below).
* Improve the helper bash script to support both a hash
gather mode (from an input directory) and a sort-only
mode that sorts an existing blacklist file.
Signed-off-by: Anestis Bechtsoudis <anestis@census-labs.com>
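
A minimal C sketch of the sorted-order check referred to in the first
bullet. The blacklistIsSorted name and the flat uint64_t array are
illustrative assumptions, not the actual honggfuzz code:

    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>

    /* Illustrative check: interpolation search requires the blacklist
     * hashes to be in ascending order, so flag unsorted input. */
    static bool blacklistIsSorted(const uint64_t *hashes, size_t count) {
        for (size_t i = 1; i < count; i++) {
            if (hashes[i - 1] > hashes[i]) {
                return false; /* out-of-order entry: the file was not pre-sorted */
            }
        }
        return true;
    }

A caller could fall back to qsort(3) on the array when this returns
false instead of rejecting the file outright.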
The idea is to skip already-analyzed crashes at fuzzing time
without having to transfer entire crash files between target
workspaces. Additionally, the same vulnerable library might be
loaded at a different address, resulting in noisy duplicates
(due to PC or ADDR) that can be avoided with stack hash
blacklists.
* New calling argument (-B) exported so that the user can
provide a file with blacklisted stack hashes (hex format,
one per line). The input file must be sorted (using the
provided bash script is strongly recommended).
* The file is parsed during the init phase and stored on the
heap (see the loading sketch below).
* When a crash is detected on the MAC & LINUX archs, the
stack hash is checked against the blacklist using a semi-fast
interpolation search over the heap array (see the lookup
sketch below).
* The stack hash blacklist works independently of the
unique crashes feature.
* The tools/createStackBlacklist.sh script can be used post-
campaign (after initial analysis) to extract stack hashes
from crash files (following the HF convention) and create
a sorted blacklist file. The same script can also be used
to sort existing blacklist files.
TODO: Ensure that the blacklist file is sorted at the init phase
Signed-off-by: Anestis Bechtsoudis <anestis@census-labs.com>
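
A hedged sketch of the load step described above: one hexadecimal
stack hash per line, read at init time into a heap-allocated array.
The blacklistLoad name, growth strategy, and error handling are
assumptions for illustration, not the actual honggfuzz implementation:

    #include <inttypes.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Illustrative loader: parse hex stack hashes (one per line) into a
     * growable heap array that stays resident for the whole campaign. */
    static uint64_t *blacklistLoad(const char *path, size_t *countOut) {
        FILE *fp = fopen(path, "r");
        if (fp == NULL) {
            return NULL;
        }
        uint64_t *hashes = NULL;
        size_t count = 0, cap = 0;
        uint64_t hash;
        while (fscanf(fp, "%" SCNx64, &hash) == 1) {
            if (count == cap) {
                cap = cap ? cap * 2 : 1024;
                uint64_t *tmp = realloc(hashes, cap * sizeof(*hashes));
                if (tmp == NULL) {
                    free(hashes);
                    fclose(fp);
                    return NULL;
                }
                hashes = tmp;
            }
            hashes[count++] = hash;
        }
        fclose(fp);
        *countOut = count;
        return hashes; /* expected to be pre-sorted; see the TODO above */
    }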
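
And a hedged sketch of the lookup itself: an interpolation search over
the sorted, in-memory hash array, as mentioned in the crash-detection
bullet. Names are illustrative, and the 128-bit intermediate is a
GCC/Clang extension used only to avoid overflow in the position
estimate:

    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>

    /* Illustrative lookup: estimate the probe position from the value
     * distribution, then narrow the [lo, hi] range as in binary search. */
    static bool blacklistContains(const uint64_t *hashes, size_t count, uint64_t needle) {
        if (count == 0) {
            return false;
        }
        size_t lo = 0, hi = count - 1;
        while (lo <= hi && needle >= hashes[lo] && needle <= hashes[hi]) {
            if (hashes[hi] == hashes[lo]) {
                return hashes[lo] == needle;
            }
            size_t pos = lo + (size_t)(((__uint128_t)(needle - hashes[lo]) * (hi - lo)) /
                                       (hashes[hi] - hashes[lo]));
            if (hashes[pos] == needle) {
                return true;
            }
            if (hashes[pos] < needle) {
                lo = pos + 1;
            } else {
                if (pos == 0) {
                    return false; /* defensive: cannot move below index 0 */
                }
                hi = pos - 1;
            }
        }
        return false;
    }

On a crash, the detector would hash the stack, call this lookup, and
skip saving the crash when it returns true; this is only a sketch of
the flow, not the actual call sites.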